In Chapter 1, The Nuts and Bolts of Neural Networks, and Chapter 2, Understanding Convolutional Networks, we took an in-depth look at the properties of general feedforward networks and their specialized incarnation, Convolutional Neural Networks (CNNs). In this chapter, we'll close this story arc with Recurrent Neural Networks (RNNs). The NN architectures we discussed in the previous chapters take a fixed-size input and produce a fixed-size output. RNNs lift this constraint: they can process input sequences of variable length by defining a recurrent relationship over these sequences (hence the name). If you are already familiar with some of the topics covered in this chapter, feel free to skip them.
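To make the idea of a recurrent relationship concrete before we dive in, here is a minimal NumPy sketch (with hypothetical names, not code from this chapter): the same step function is applied at every position of the sequence, which is what lets the network handle inputs of any length.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrent step: the new state depends on the current input
    and the previous state (a hypothetical, simplified formulation)."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Apply the same step over a sequence of arbitrary length."""
    h = np.zeros(W_hh.shape[0])
    for x_t in xs:              # xs can contain any number of time steps
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
    return h                    # the final state summarizes the whole sequence

# Usage: two sequences of different lengths share the same weights
rng = np.random.default_rng(0)
W_xh, W_hh, b_h = rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), np.zeros(4)
short_seq = rng.normal(size=(5, 3))     # 5 time steps
long_seq = rng.normal(size=(12, 3))     # 12 time steps
print(rnn_forward(short_seq, W_xh, W_hh, b_h))
print(rnn_forward(long_seq, W_xh, W_hh, b_h))
```

We'll develop this recurrence properly, including how it is trained, in the sections that follow.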
In this chapter, we will cover the following topics:
- Introduction to RNNs
- Introducing long short-term memory
- Introducing...