In this section, we will implement the famous word2vec example: adding the vector for woman to the vector for king and subtracting the vector for man. The resulting vector lies closest to the vector for queen.
We are not going to train a word2vec model on our own data, because Google has already trained a word2vec model on a huge corpus and released it as a pre-trained model. Replicating that training on so much data would require a lot of computational resources, so we will simply use Google's pre-trained model. You can download it from this link: https://code.google.com/archive/p/Word2vec/.
After clicking on this link, go to the section entitled Pre-trained word and phrase vectors and download the model named GoogleNews-vectors-negative300...
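Once the model is downloaded, the analogy itself is just vector arithmetic followed by a nearest-neighbor search. As a minimal sketch, the snippet below illustrates the king - man + woman computation with tiny hand-crafted 2-d vectors (the real pre-trained vectors are 300-dimensional); the commented lines show how the same query could be run against the actual GoogleNews model, assuming the gensim library is installed:

```python
import numpy as np

# With the real pre-trained model (assumes gensim is installed):
# from gensim.models import KeyedVectors
# model = KeyedVectors.load_word2vec_format(
#     'GoogleNews-vectors-negative300.bin', binary=True)
# model.most_similar(positive=['woman', 'king'], negative=['man'], topn=1)

# Toy 2-d vectors, hand-crafted for illustration only:
# axis 0 roughly encodes "royalty", axis 1 roughly encodes "gender".
vectors = {
    "king":  np.array([1.0,  1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0,  1.0]),
    "woman": np.array([0.0, -1.0]),
    "apple": np.array([-1.0, 0.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(positive, negative, vectors):
    """Return the word whose vector is closest (by cosine similarity)
    to sum(positive vectors) - sum(negative vectors), excluding the
    query words themselves, as word2vec's most_similar does."""
    target = sum(vectors[w] for w in positive) - sum(vectors[w] for w in negative)
    exclude = set(positive) | set(negative)
    scores = {w: cosine(target, v) for w, v in vectors.items() if w not in exclude}
    return max(scores, key=scores.get)

print(analogy(["woman", "king"], ["man"], vectors))  # queen
```

The key idea is the same at either scale: word2vec places semantically related words in nearby directions, so the offset from man to woman, applied to king, lands near queen.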