Word2Vec is an efficient neural-network-based algorithm for generating word embeddings. It was introduced by Mikolov et al. in Efficient Estimation of Word Representations in Vector Space (2013) and refined in the follow-up paper Distributed Representations of Words and Phrases and their Compositionality (2013). The original C implementation, in the form of a command-line application, is available at https://code.google.com/archive/p/word2vec/.
Word2Vec is often referred to as an instance of deep learning, but the architecture is actually quite shallow: a single hidden (projection) layer between the input and output layers. The misconception likely stems from its wide adoption as a building block for deep networks in NLP. The Word2Vec architecture is similar to that of an autoencoder. The input to the neural network is a sufficiently large text corpus, and the output is a list of vectors (arrays of numbers), one vector for each word in the corpus. The algorithm...
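To make the shallow architecture concrete, here is a minimal sketch of skip-gram training with a full softmax in plain numpy. This is an illustration only, not the original C implementation: the toy corpus, the embedding dimension of 8, the window of 1, and the learning rate are all arbitrary choices for the example, and real Word2Vec uses hierarchical softmax or negative sampling for efficiency.

```python
import numpy as np

# Toy corpus; a real run would use a much larger text corpus.
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8                         # vocabulary size, embedding dimension
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))    # input (word) vectors: the embeddings
W_out = rng.normal(scale=0.1, size=(V, D))   # output (context) vectors

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# One SGD pass of skip-gram with a context window of 1.
lr = 0.05
for t, w in enumerate(corpus):
    for c in corpus[max(0, t - 1):t] + corpus[t + 1:t + 2]:
        h = W_in[idx[w]]                     # "hidden layer" is just the input vector
        p = softmax(W_out @ h)               # predicted context-word distribution
        grad = p.copy()
        grad[idx[c]] -= 1.0                  # cross-entropy gradient w.r.t. logits
        grad_h = W_out.T @ grad              # backprop to the input vector
        W_out -= lr * np.outer(grad, h)      # update output vectors
        W_in[idx[w]] -= lr * grad_h          # update the word's embedding

# The result: one vector per word in the corpus.
vectors = {w: W_in[idx[w]] for w in vocab}
```

The two weight matrices mirror the shallow input-projection-output structure described above; after training, `W_in` holds the word embeddings and `W_out` is typically discarded.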