Chapter 5: Working with Word Vectors and Semantic Similarity
Word vectors are handy tools that have been a hot topic in NLP for almost a decade. A word vector is a dense, real-valued representation of a word. What is remarkable about these vectors is that semantically similar words have similar word vectors, which makes them a natural fit for semantic similarity applications, such as calculating the similarity between words, phrases, sentences, and documents. At the word level, word vectors also provide information about synonymy, semantic analogies, and more. We can build semantic similarity applications on top of word vectors.
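To see this property in practice, here is a minimal sketch using spaCy's pretrained vectors; it assumes the medium English model, en_core_web_md, is installed, and the exact similarity scores will vary by model version:

```python
import spacy

# Assumes the medium English model with vectors is installed:
#   python -m spacy download en_core_web_md
nlp = spacy.load("en_core_web_md")

doc = nlp("cat dog banana")
cat, dog, banana = doc

# Each token carries a dense vector (300 dimensions in this model).
print(cat.vector.shape)

# Semantically related words score higher than unrelated ones.
print(cat.similarity(dog))      # relatively high
print(cat.similarity(banana))   # noticeably lower

# Similarity also works on spans and whole documents.
doc1 = nlp("I like salty fries and hamburgers.")
doc2 = nlp("Fast food tastes very good.")
print(doc1.similarity(doc2))
```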
Word vectors are produced by algorithms that exploit the fact that similar words appear in similar contexts. To capture the meaning of a word, a word vector algorithm collects information about the surrounding words that the target word co-occurs with. This paradigm of capturing a word's meaning through its surrounding words is called distributional semantics.
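To make the idea concrete, here is a toy sketch (an illustration of the raw co-occurrence signal, not a real training algorithm) that counts the context words around each target word in a tiny corpus; words used in similar contexts, such as "cat" and "dog" below, end up with nearly identical context profiles, which is exactly what vector algorithms exploit:

```python
from collections import Counter, defaultdict

def context_counts(sentences, window=2):
    """Count which words appear within +/- `window` positions of each word."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        tokens = sentence.lower().split()
        for i, target in enumerate(tokens):
            start = max(0, i - window)
            end = min(len(tokens), i + window + 1)
            for j in range(start, end):
                if j != i:
                    counts[target][tokens[j]] += 1
    return counts

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]
counts = context_counts(corpus)
print(counts["cat"])  # 'cat' and 'dog' share most of their context words
print(counts["dog"])
```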
...