Word2vec is one of the most popular and widely used models for generating word embeddings. What are word embeddings, though? Word embeddings are vector representations of words in a continuous vector space. The embeddings generated by the word2vec model capture the syntactic and semantic properties of a word, and having such a meaningful vector representation helps a neural network understand the word better.
For instance, let's consider the following text: Archie used to live in New York; he then moved to Santa Clara. He loves apples and strawberries.
The word2vec model generates a vector representation for each word in the preceding text. If we project and visualize these vectors in the embedding space, we can see that similar words are plotted close together. As you can see in the following figure, the words apples and strawberries are plotted close to each other, since they are semantically related.
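To make this concrete, here is a minimal sketch using the gensim library (an assumption; the text itself does not name a library) that trains a word2vec model on the toy sentences above and queries the learned vectors. Note that on a corpus this tiny the learned similarities are essentially noise; meaningful neighbors only emerge when training on a large corpus.

```python
from gensim.models import Word2Vec

# Tokenized toy corpus from the example above (lowercased for simplicity)
sentences = [
    ["archie", "used", "to", "live", "in", "new", "york"],
    ["he", "then", "moved", "to", "santa", "clara"],
    ["he", "loves", "apples", "and", "strawberries"],
]

# Train a small word2vec model; min_count=1 keeps every word,
# which is only sensible because the toy corpus is so small
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, seed=42)

# Each word is now mapped to a dense vector of length vector_size
print(model.wv["apples"].shape)  # (50,)

# Cosine similarity between two word vectors; with a real corpus,
# related words such as "apples" and "strawberries" score highly
print(model.wv.similarity("apples", "strawberries"))
```

Projecting the resulting vectors down to two dimensions (for example with PCA or t-SNE) is what produces plots like the figure referenced above, where semantically related words cluster together.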