Word embeddings are such an elegant idea that they quickly became an indispensable part of many applications in NLP and other domains. Here are several possible directions for further exploration:
- You can easily turn the Word Association game into a question-answering system by replacing word vectors with sentence vectors. The simplest way to get a sentence vector is to add (or average) the vectors of its words. Interestingly, such sentence vectors still capture the sentence's meaning, so you can use them to find similar sentences.
- By clustering embedding vectors, you can group words, sentences, and documents by similarity.
- As we have mentioned, Word2Vec vectors are popular as building blocks of more complex NLP pipelines. For example, you can feed them into a neural network or some other machine learning algorithm. In this way, you...
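To make the first idea concrete, here is a minimal sketch of averaged sentence vectors. The 3-dimensional vectors below are made-up stand-ins for real Word2Vec embeddings, chosen so that animal words and finance words point in different directions:

```python
import numpy as np

# Toy word vectors (hypothetical values) standing in for trained embeddings.
word_vectors = {
    "cat":    np.array([0.9, 0.1, 0.0]),
    "dog":    np.array([0.8, 0.2, 0.0]),
    "sat":    np.array([0.1, 0.9, 0.1]),
    "ran":    np.array([0.0, 0.8, 0.2]),
    "stock":  np.array([0.0, 0.1, 0.9]),
    "market": np.array([0.1, 0.0, 0.9]),
}

def sentence_vector(sentence):
    # Average the word vectors; summing works too, but averaging keeps
    # the result independent of sentence length.
    vectors = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vectors, axis=0)

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = sentence_vector("the cat sat")
s2 = sentence_vector("the dog ran")
s3 = sentence_vector("the stock market")

# The two animal sentences end up closer to each other than to the finance one.
print(cosine_similarity(s1, s2), cosine_similarity(s1, s3))
```

With real embeddings (typically 100–300 dimensions) the same few lines work unchanged; only the `word_vectors` lookup is replaced by a trained model.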
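The clustering idea can be sketched just as briefly. K-means is one common choice of algorithm; the 2-dimensional vectors here are again hypothetical toy embeddings, separable into an animal group and a currency group:

```python
import numpy as np

# Toy embeddings (hypothetical values); in practice these come from a trained model.
embeddings = {
    "cat": [0.9, 0.1], "dog": [0.85, 0.15], "mouse": [0.8, 0.2],
    "euro": [0.1, 0.9], "dollar": [0.15, 0.85], "yen": [0.2, 0.8],
}

def kmeans(points, k, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centroids at k distinct data points.
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = np.argmin(
            np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2), axis=1
        )
        # Move each centroid to the mean of its assigned points.
        centroids = np.array([points[labels == c].mean(axis=0) for c in range(k)])
    return labels

words = list(embeddings)
points = np.array([embeddings[w] for w in words])
labels = kmeans(points, k=2)
clusters = {c: [w for w, l in zip(words, labels) if l == c] for c in set(labels.tolist())}
print(clusters)
```

The same code groups sentence or document vectors once you have them; only the inputs change.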
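Finally, a sketch of feeding embedding features into a downstream model. Logistic regression trained by plain gradient descent stands in here for the neural network or other algorithm; the word vectors and the tiny labeled corpus are made up for illustration:

```python
import numpy as np

# Toy word vectors (hypothetical values) in place of trained Word2Vec embeddings.
word_vectors = {
    "cat": [0.9, 0.1, 0.0], "dog": [0.8, 0.2, 0.1],
    "sat": [0.2, 0.9, 0.0], "ran": [0.1, 0.8, 0.1],
    "stock": [0.0, 0.1, 0.9], "market": [0.1, 0.0, 0.8],
    "price": [0.0, 0.2, 0.9], "trade": [0.1, 0.1, 0.8],
}

def featurize(sentence):
    # Averaged word vectors serve as the feature vector of the sentence.
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0)

# Tiny labeled corpus: 0 = animals, 1 = finance.
sentences = ["the cat sat", "the dog ran", "the stock market", "the trade price"]
labels = np.array([0.0, 0.0, 1.0, 1.0])
X = np.array([featurize(s) for s in sentences])

# Logistic regression via gradient descent: embedding features go
# straight into the classifier.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    grad = p - labels                        # gradient of the log loss
    w -= 0.5 * X.T @ grad / len(labels)
    b -= 0.5 * grad.mean()

def predict(sentence):
    return int(1.0 / (1.0 + np.exp(-(featurize(sentence) @ w + b))) > 0.5)

print(predict("the dog sat"), predict("the stock trade"))
```

The classifier generalizes to unseen word combinations because similar words already have similar vectors, which is exactly what makes embeddings so useful as pipeline inputs.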