Understanding word vectors
The invention of word vectors (popularized by the word2vec algorithm) has been one of the most exciting advancements in NLP. If you practice NLP, you have almost certainly come across word vectors at some point. This chapter will help you understand the idea that motivated word2vec, what word vectors look like, and how to use them in NLP applications.
Statistical methods work with numbers, and statistical NLP algorithms in particular operate on vectors. As a result, when working with statistical methods, we need to represent every real-world quantity as a vector, including text. In this section, we will learn about the different ways we can represent text as vectors and discover how word vectors provide a semantic representation of words.
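To make this concrete, here is a minimal sketch of measuring word similarity as vector similarity. The two-dimensional vectors below are made up purely for illustration; real word vectors typically have hundreds of dimensions learned from a corpus:

```python
import numpy as np

# Made-up 2-D vectors for illustration only; learned word vectors
# usually have 100-300 dimensions.
vectors = {
    "cat": np.array([0.9, 0.1]),
    "dog": np.array([0.8, 0.2]),
    "car": np.array([0.1, 0.9]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(vectors["cat"], vectors["dog"]))  # high: ~0.99
print(cosine_similarity(vectors["cat"], vectors["car"]))  # lower: ~0.22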
We will start our exploration of text vectorization with the simplest scheme possible: one-hot encoding.
One-hot encoding
One-hot encoding is a simple and straightforward way of vectorizing text: each word is represented by a vector whose length equals the vocabulary size, with a 1 at the position assigned to that word and 0s everywhere else.
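As a minimal sketch of this scheme (the toy vocabulary and helper name here are illustrative, not from the text), the following Python snippet builds one-hot vectors for a four-word vocabulary:

```python
import numpy as np

# A toy vocabulary; in practice this is built from a corpus.
vocab = ["cat", "dog", "fish", "bird"]
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word: str) -> np.ndarray:
    """Return a vocabulary-sized vector with a 1 at the word's index."""
    vector = np.zeros(len(vocab))
    vector[word_to_index[word]] = 1.0
    return vector

print(one_hot("dog"))  # [0. 1. 0. 0.]
```

Note that each vector grows with the vocabulary and carries no notion of similarity: "cat" and "dog" are exactly as far apart as "cat" and "bird".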