Neural networks accept only numeric inputs. So, when we have textual data, we convert it into a numeric (vector) representation and feed it to the network. There are various methods for converting input text to numeric form; some of the popular ones include term frequency-inverse document frequency (tf-idf) and bag of words (BOW). However, these methods do not capture the semantics of words. That is, they do not understand the meaning of the words they represent.
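To make this concrete, the following is a minimal sketch of a bag-of-words representation using only the Python standard library (the function name and the example sentences are illustrative, not part of any particular library). Note how two sentences with nearly the same meaning end up with almost no overlap in their count vectors, which is exactly the limitation described above:

```python
from collections import Counter

def bag_of_words(texts):
    """A minimal BOW sketch: build a vocabulary from the texts and
    represent each text as a vector of raw word counts."""
    vocab = sorted({word for text in texts for word in text.lower().split()})
    vectors = []
    for text in texts:
        counts = Counter(text.lower().split())
        vectors.append([counts.get(word, 0) for word in vocab])
    return vocab, vectors

# "good movie" and "great film" mean nearly the same thing, but their
# count vectors share only the word "a" -- BOW treats them as unrelated.
vocab, vecs = bag_of_words(["a good movie", "a great film"])
```

Because each word is just an index in the vocabulary, BOW has no way of knowing that "movie" and "film" are synonyms; this is the gap that word2vec fills.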
In this chapter, we will learn about an algorithm called word2vec, which converts textual input into meaningful vectors. It learns a semantic vector representation for each word in the given input text. We will start the chapter by understanding the word2vec model and its two variants, the continuous bag-of-words (CBOW) model and the skip-gram model...