In this chapter, we covered a powerful type of neural network—RNNs. We also learned about several variations of the RNN cell, such as LSTM cells and GRUs. Like the networks in prior chapters, these cells can be stacked into deep RNNs, which offer several advantages. In particular, each additional layer can capture more complex patterns in sequential data, such as language.
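As a minimal sketch of the stacking idea (not the chapter's own code), a deep RNN simply feeds one layer's sequence of hidden states into the next layer as its input sequence. The example below illustrates this with a two-layer vanilla RNN forward pass in NumPy; all dimensions and weight initializations are arbitrary choices for illustration:

```python
import numpy as np

def rnn_layer(x_seq, W_x, W_h, b):
    """Run one vanilla RNN layer over a sequence; return all hidden states."""
    h = np.zeros(W_h.shape[0])
    outputs = []
    for x in x_seq:
        h = np.tanh(W_x @ x + W_h @ h + b)  # classic tanh RNN cell update
        outputs.append(h)
    return np.stack(outputs)

rng = np.random.default_rng(0)
seq_len, in_dim, hid_dim = 5, 8, 16
x_seq = rng.normal(size=(seq_len, in_dim))

# Layer 1: maps the input sequence to a sequence of hidden states.
W_x1 = rng.normal(scale=0.1, size=(hid_dim, in_dim))
W_h1 = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
h1_seq = rnn_layer(x_seq, W_x1, W_h1, np.zeros(hid_dim))

# Layer 2: treats layer 1's hidden states as its own input sequence --
# this is all "deep" means for RNNs.
W_x2 = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
W_h2 = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
h2_seq = rnn_layer(h1_seq, W_x2, W_h2, np.zeros(hid_dim))

print(h2_seq.shape)  # one hidden state per time step: (5, 16)
```

The same wiring applies unchanged when the vanilla cell is swapped for an LSTM or GRU cell; only the per-step update inside the layer changes.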
In the next chapter, we will learn about attention mechanisms and their increasing popularity in language- and vision-related tasks.