As you have seen, when working with text in machine learning, you must convert the text into numerical values. The same logic applies in neural architectures. In neural networks, this conversion is handled by the embeddings layer, and all modern deep learning libraries provide an embeddings API.
The embeddings layer is a versatile layer that serves several purposes:
- It can learn word embeddings from scratch, for later use in another application
- It can be used inside a larger model, where the embeddings are tuned along with the rest of the model
- It can load a pretrained word embedding
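Before looking at pretrained weights, it helps to see what the layer actually computes. An embeddings layer is just a lookup table: a matrix with one dense vector per vocabulary entry, indexed by integer token ids. The following is a minimal numpy sketch of that idea; the names (`embed`, `embedding_matrix`) are illustrative, not any particular library's API.

```python
import numpy as np

# Toy vocabulary of 10 tokens, each mapped to a 4-dimensional vector.
vocab_size, embedding_dim = 10, 4

# The table starts out as small random weights; during training these
# rows would be tuned like any other model parameters.
rng = np.random.default_rng(seed=0)
embedding_matrix = rng.normal(scale=0.1, size=(vocab_size, embedding_dim))

def embed(token_ids):
    # Embedding a batch of token ids is just row indexing into the table.
    return embedding_matrix[token_ids]

batch = np.array([2, 5, 5, 9])  # a toy sequence of token ids
vectors = embed(batch)          # one dense vector per token
```

Note that the two occurrences of token id 5 map to the same row, which is exactly the parameter sharing that makes the layer efficient.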
The third point is the focus of this section. The idea is to use fastText to create superior embeddings, which are then injected into your model through the embeddings layer. Normally the embeddings layer is initialized with random weights...
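The injection step can be sketched as building a weight matrix whose rows come from the pretrained vectors. The snippet below assumes a small hypothetical `pretrained` dictionary standing in for vectors loaded from a fastText model (in practice you would load them with gensim or the fastText library itself), and a `word_index` mapping produced by your tokenizer.

```python
import numpy as np

# Hypothetical pretrained vectors, standing in for fastText output.
pretrained = {
    "the": np.array([0.1, 0.2, 0.3]),
    "cat": np.array([0.4, 0.5, 0.6]),
}
word_index = {"the": 1, "cat": 2, "zorp": 3}  # id 0 reserved for padding
embedding_dim = 3

# Build the weight matrix: copy in pretrained rows where available,
# leaving zeros for padding and out-of-vocabulary words.
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, idx in word_index.items():
    vec = pretrained.get(word)
    if vec is not None:
        embedding_matrix[idx] = vec

# This matrix is then supplied as the initial weights of the embeddings
# layer, e.g. in Keras:
#   Embedding(input_dim=embedding_matrix.shape[0],
#             output_dim=embedding_dim,
#             weights=[embedding_matrix],
#             trainable=False)  # freeze, or True to fine-tune
```

Setting the layer to non-trainable keeps the fastText vectors fixed; leaving it trainable lets the model fine-tune them for the task at hand.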