Summary
In this chapter, we have shown how CNNs can be used to learn from NLP data and how to train one from scratch using PyTorch. While the deep learning methodology is very different from that used in RNNs, conceptually, CNNs draw on the motivation behind n-gram language models: a convolutional filter spanning n words extracts implicit information about each word in a sentence from the context of its neighboring words. Now that we have mastered both RNNs and CNNs, we can begin to expand on these techniques in order to construct even more advanced models.
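The n-gram analogy can be made concrete with a minimal sketch: a Conv1d layer with a kernel size of 3 applied over word embeddings computes one feature per trigram window, much like an n-gram feature extractor. The vocabulary size, embedding dimension, and filter count below are arbitrary illustrative values, not from the chapter.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only
vocab_size, embed_dim, n_filters = 100, 16, 8

embedding = nn.Embedding(vocab_size, embed_dim)
# kernel_size=3: each filter spans three consecutive words,
# analogous to extracting trigram features
conv = nn.Conv1d(in_channels=embed_dim, out_channels=n_filters, kernel_size=3)

tokens = torch.randint(0, vocab_size, (1, 10))  # batch of one 10-word sentence
x = embedding(tokens).permute(0, 2, 1)          # (batch, embed_dim, seq_len)
features = torch.relu(conv(x))                  # (1, n_filters, 8): one value per trigram window
pooled = torch.max(features, dim=2).values      # max-pool over all window positions
print(pooled.shape)                             # torch.Size([1, 8])
```

Max-pooling over the window positions keeps the strongest response of each filter regardless of where in the sentence its n-gram pattern occurred, which is why this style of model is insensitive to word position in a way RNNs are not.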
In the next chapter, we will learn how to build models that combine elements of both convolutional and recurrent neural networks and apply them to sequences to perform even more advanced tasks, such as text translation. These are known as sequence-to-sequence networks.