In this chapter, we discussed RNNs. First, we covered the theory of RNNs and backpropagation through time. Then, we implemented an RNN from scratch to solidify our knowledge of the subject. Next, we moved on to the more complex LSTM and GRU cells, following the same pattern: a theoretical explanation, followed by a practical PyTorch implementation. Finally, we combined our knowledge from Chapter 6, Language Modeling, with the new material from this chapter to implement a full-featured sentiment analysis task.
In the next chapter, we'll discuss seq2seq models and their variations—an exciting development in sequence processing.