Word vectors are useful building blocks in many applications. They capture and encode the semantic relationships between words by transforming each word into a sequence of numbers: a dense vector that is well suited for training deep learning models. In this chapter, we will take a detailed look at the various approaches to building such semantically useful word embeddings.
Word vectors
The classical approach
Traditionally, word representations were built with approaches such as the bag-of-words model, which treats individual words as independent of one another. Hence, such representations often used a one-hot encoding, which indicates the presence of a word with a 1 at that word's index in the vocabulary and 0s everywhere else.
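To make this concrete, here is a minimal sketch of one-hot encoding in Python. The toy corpus and the one_hot helper are illustrative assumptions for this example, not part of any particular library:

import numpy as np

# A toy corpus; in practice the vocabulary comes from the full training data.
corpus = ["the cat sat on the mat"]

# Collect the unique words and assign each one a fixed index.
vocab = sorted({word for sentence in corpus for word in sentence.split()})
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    # A vector of zeros with a single 1 at the word's vocabulary index.
    vector = np.zeros(len(vocab))
    vector[word_to_index[word]] = 1.0
    return vector

print(vocab)           # ['cat', 'mat', 'on', 'sat', 'the']
print(one_hot("cat"))  # [1. 0. 0. 0. 0.]

Note that each vector is as long as the vocabulary and consists almost entirely of zeros, and any two distinct word vectors are orthogonal. No semantic relationship between words is captured, which is precisely the limitation that the dense embeddings discussed in this chapter are designed to overcome.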