Summary
In this chapter, we worked through three projects built on recurrent neural networks: sentiment analysis, stock price prediction, and text generation. We started with a detailed explanation of the recurrent mechanism and the RNN architectures suited to different forms of input and output sequences (such as one-to-many, many-to-one, and many-to-many). You also learned how LSTM, with its gated memory cells, improves on vanilla RNNs.
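To recap the core idea, here is a minimal sketch of a many-to-one LSTM classifier of the kind used for sentiment analysis. It is written in PyTorch purely for illustration; the framework, vocabulary size, and layer dimensions are assumptions for this example, not the exact models built in this chapter.

```python
# Illustrative sketch only: a minimal many-to-one LSTM classifier.
# Vocabulary size, embedding size, and hidden size are assumed values.
import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Replacing nn.RNN with nn.LSTM adds gating (input, forget, and output
        # gates), which helps mitigate vanishing gradients on long sequences.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)           # (batch, seq_len, embed_dim)
        _, (last_hidden, _) = self.lstm(embedded)      # last_hidden: (1, batch, hidden_dim)
        return torch.sigmoid(self.fc(last_hidden[-1])) # one sentiment score per sequence

# Example: score a batch of 4 dummy token sequences of length 20
model = SentimentLSTM()
dummy_batch = torch.randint(0, 10000, (4, 20))
print(model(dummy_batch).shape)  # torch.Size([4, 1])
```

The same many-to-one pattern applies to the other sequence tasks in the chapter; only the input representation and the output head change (for example, a regression head for price prediction or a per-step softmax for text generation).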
In the next chapter, we will focus on the Transformer, a recent state-of-the-art architecture for sequence learning, as well as generative models.