The best-known names in word embedding are word2vec from Google (Mikolov) and GloVe from Stanford (Pennington, Socher, and Manning). fastText, from Facebook AI Research, is widely used for multilingual sub-word embeddings.
We advise against using word2vec or GloVe. Instead, use fastText vectors, which are much better in practice because they build vectors from sub-word (character n-gram) information and so can embed out-of-vocabulary words. They also come from the same lead author: word2vec was introduced by T. Mikolov et al. (https://scholar.google.com/citations?user=oBu8kMMAAAAJ&hl=en) while he was at Google, where it performed well on word similarity and analogy tasks, and fastText is his later work at Facebook AI Research.
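To see why the sub-word information matters in practice, here is a minimal sketch of loading pretrained fastText vectors with gensim. The model filename is a placeholder for whichever official .bin model you download from https://fasttext.cc/docs/en/crawl-vectors.html, and the query words are just illustrations:

```python
# A minimal sketch: loading pretrained fastText vectors via gensim.
# Assumes you've downloaded an official .bin model (e.g. cc.en.300.bin,
# hypothetical path below) from https://fasttext.cc/docs/en/crawl-vectors.html
from gensim.models.fasttext import load_facebook_vectors

wv = load_facebook_vectors("cc.en.300.bin")

# Standard similarity queries work as with word2vec/GloVe vectors.
print(wv.most_similar("language", topn=3))

# Because fastText composes a vector from character n-grams, even a
# misspelled, out-of-vocabulary word still gets a sensible vector,
# which word2vec and GloVe cannot do.
print(wv["embeddinngs"][:5])
```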
GloVe was introduced by Pennington, Socher, and Manning at Stanford in 2014. Rather than training a neural network over a sliding window, it takes a count-based, statistical approach: the word vectors are produced by factorizing a word-word co-occurrence matrix, fitting each pair of vectors so that their dot product (plus per-word biases) approximates the logarithm of how often the two words appear together.
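To make that factorization concrete, here is a toy sketch of GloVe's weighted least-squares objective in plain NumPy. The four-word vocabulary and co-occurrence counts are invented for illustration, and real GloVe trains with AdaGrad over a sparse, corpus-scale count matrix; this is just the core update rule:

```python
# Toy GloVe-style factorization of a word-word co-occurrence matrix.
# Illustrative only: vocabulary and counts below are made up.
import numpy as np

rng = np.random.default_rng(0)

vocab = ["ice", "steam", "solid", "gas"]
X = np.array([[0, 2, 8, 1],    # hypothetical symmetric co-occurrence counts
              [2, 0, 1, 7],
              [8, 1, 0, 0],
              [1, 7, 0, 0]], dtype=float)

V, dim, lr = len(vocab), 2, 0.05
W = rng.normal(scale=0.1, size=(V, dim))        # main word vectors
W_tilde = rng.normal(scale=0.1, size=(V, dim))  # context word vectors
b, b_tilde = np.zeros(V), np.zeros(V)           # per-word biases

def weight(x, x_max=100.0, alpha=0.75):
    # GloVe's weighting f(X_ij): down-weights rare pairs, caps frequent ones.
    return (x / x_max) ** alpha if x < x_max else 1.0

for epoch in range(500):
    for i in range(V):
        for j in range(V):
            if X[i, j] == 0:
                continue  # GloVe only trains on observed co-occurrences
            # Inner term of the cost: w_i . w~_j + b_i + b~_j - log X_ij
            diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
            g = weight(X[i, j]) * diff  # constant factor of 2 folded into lr
            wi, wj = W[i].copy(), W_tilde[j].copy()
            W[i]       -= lr * g * wj
            W_tilde[j] -= lr * g * wi
            b[i]       -= lr * g
            b_tilde[j] -= lr * g

# As in the GloVe paper, the final embedding is the sum of both vector sets.
embeddings = W + W_tilde
print(dict(zip(vocab, np.round(embeddings, 2))))
```

After training, words with similar co-occurrence profiles (here, "ice"/"solid" and "steam"/"gas") end up with nearby vectors, which is exactly the property the matrix factorization is fitting for.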
If forced to pick the lesser of two evils, we recommend GloVe over word2vec. This is because GloVe outperforms...