Now that you have taken a look at creating models on the command line, you might be wondering how fastText creates those word representations. In this chapter, you will learn what happens behind the scenes and about the algorithms that power fastText.
We will cover the following topics in this chapter:
- Word-to-vector representations
- Types of word representations
- Getting vector representations from text
- Model architecture in fastText
- The unsupervised model
- fastText skipgram implementation
- Continuous bag-of-words (CBOW)
- Comparison between skipgram and CBOW
- Loss functions and optimizations
- Softmax
- Context definitions