We started off the chapter by covering what an RNN is and how it differs from a feedforward network. We learned that an RNN is a special type of neural network that is widely applied to sequential data; it predicts output based not only on the current input but also on the previous hidden state, which acts as a memory and stores the sequence of information the network has seen so far.
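To recap that recurrence concretely, here is a minimal NumPy sketch of a single RNN forward step; the weight names U, W, and V and the tanh/softmax choices follow the standard vanilla-RNN formulation and are illustrative assumptions rather than the chapter's exact code:

```python
import numpy as np

def rnn_step(x_t, h_prev, U, W, V):
    """One forward step of a vanilla RNN (illustrative sketch).

    The new hidden state mixes the current input x_t with the previous
    hidden state h_prev, which is what gives the network its memory of
    the sequence seen so far.
    """
    h_t = np.tanh(U @ x_t + W @ h_prev)            # update the hidden state (memory)
    logits = V @ h_t                               # project the hidden state to the output space
    y_hat = np.exp(logits) / np.exp(logits).sum()  # softmax over the output vocabulary
    return y_hat, h_t
```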
We learned how forward propagation works in RNNs, and then we explored a detailed derivation of the BPTT algorithm, which is used for training RNNs. We then implemented an RNN in TensorFlow to generate song lyrics. At the end of the chapter, we learned about the different architectures of RNNs, such as one-to-one, one-to-many, many-to-one, and many-to-many, which are used for various applications.
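As a one-line reminder of what makes BPTT different from ordinary backpropagation: with a total loss $L = \sum_t L_t$, the gradient with respect to the recurrent weight matrix $W$ gathers contributions from every time step, because each hidden state depends on all of the earlier ones. In the usual vanilla-RNN notation (an assumption here, not necessarily the chapter's exact symbols):

$$
\frac{\partial L}{\partial W} = \sum_{t} \frac{\partial L_t}{\partial \hat{y}_t}\,\frac{\partial \hat{y}_t}{\partial h_t} \sum_{k=1}^{t} \left( \prod_{j=k+1}^{t} \frac{\partial h_j}{\partial h_{j-1}} \right) \frac{\partial h_k}{\partial W}
$$

The repeated product of hidden-to-hidden Jacobians in this sum is what can make gradients shrink or blow up over long sequences.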
In the next chapter, we will learn about the LSTM cell, which solves the vanishing gradient problem.