Tag: word2vec

  • Word2Vec and GloVe Vectors

    Last time, we saw how autoencoders are used to learn a latent embedding space: an alternative, low-dimensional representation of a set of data with some appealing properties. For example, we saw that interpolating in the latent space is a way of generating new examples. In particular, interpolation in the latent space generates more compelling examples than, […]
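
    To make the interpolation idea concrete, here is a minimal sketch (not from the original post): a hypothetical `Autoencoder` with illustrative dimensions, where we encode two inputs, linearly blend their latent codes, and decode each blend to produce an intermediate example. In practice the model would be trained first; here the weights are untrained stand-ins.

    ```python
    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        """Hypothetical autoencoder: input_dim and latent_dim are illustrative."""
        def __init__(self, input_dim=784, latent_dim=32):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(input_dim, 128), nn.ReLU(),
                nn.Linear(128, latent_dim))
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(),
                nn.Linear(128, input_dim), nn.Sigmoid())

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = Autoencoder()          # assume this was trained beforehand
    x1 = torch.rand(1, 784)        # two example inputs (e.g. flattened images)
    x2 = torch.rand(1, 784)

    z1, z2 = model.encoder(x1), model.encoder(x2)

    # Interpolate in the latent space, then decode: each decoded point
    # is a new example lying "between" the two originals.
    for alpha in torch.linspace(0, 1, steps=5):
        z = (1 - alpha) * z1 + alpha * z2
        x_new = model.decoder(z)
    ```

    The key point is that the blending happens in the latent space, not in the raw input space: decoding the interpolated codes tends to yield plausible intermediate examples rather than a simple pixel-wise crossfade.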