GloVe, LexVec, FastText.
One popular alternative to word2vec is GloVe (Global Vectors).
Doc2VecC - Efficient Vector Representation for Documents through Corruption.
https://openreview.net/pdf?id=B1Igu2ogg
https://github.com/mchen24/iclr2017
Both word2vec and GloVe learn geometrical encodings (vectors) of words from their co-occurrence information (how frequently they appear together in large text corpora). They differ in that word2vec is a "predictive" model, whereas GloVe is a "count-based" model. See this paper for more on the distinctions between these two approaches: http://clic.cimec.unitn.it/marco
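To make the "count-based" side concrete, here is a minimal sketch (the toy corpus and window size are made up for illustration) of the word-word co-occurrence statistics that a model like GloVe starts from and factorizes into vectors:

```python
from collections import Counter, defaultdict

# Toy corpus and window size, chosen purely for illustration.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]
window = 2

cooc = defaultdict(Counter)  # cooc[word][context_word] = co-occurrence count

for sent in corpus:
    for i, word in enumerate(sent):
        # Count every word within `window` positions of the current word.
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                cooc[word][sent[j]] += 1

# Count-based models learn vectors whose relationships (e.g. dot products)
# reflect statistics like these, rather than predicting words directly.
print(cooc["cat"])  # -> Counter({'the': 1, 'sat': 1, 'on': 1})
```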
Predictive models learn their vectors by minimizing Loss(target word | context words; Vectors), that is, the loss of predicting the target words from the context words given the vector representations...
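As a rough sketch of that predictive loss (CBOW-style: predict the target from averaged context vectors; the tiny vocabulary, dimensions, and random vectors below are illustrative, not a real trained model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and randomly initialized vectors (dimensions are illustrative).
vocab = ["the", "cat", "sat", "on", "mat"]
word_to_id = {w: i for i, w in enumerate(vocab)}
dim = 8
in_vectors = rng.normal(scale=0.1, size=(len(vocab), dim))   # context ("input") vectors
out_vectors = rng.normal(scale=0.1, size=(len(vocab), dim))  # target ("output") vectors

def predictive_loss(context_words, target_word):
    """-log P(target | context; Vectors): the quantity a predictive model minimizes."""
    ctx_ids = [word_to_id[w] for w in context_words]
    target_id = word_to_id[target_word]
    # Average the context vectors, then score every word in the vocabulary.
    h = in_vectors[ctx_ids].mean(axis=0)
    scores = out_vectors @ h
    log_probs = scores - np.log(np.exp(scores).sum())  # log-softmax over the vocabulary
    return -log_probs[target_id]

# Training adjusts the vectors (e.g. by gradient descent) to shrink this loss
# over many (context, target) pairs drawn from the corpus.
print(predictive_loss(["the", "sat"], "cat"))
```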