Working with RNNs
In this section, we will first provide some contextual information about RNNs. Then, we will highlight some drawbacks of classical RNNs. Finally, we will see an improved variant of RNNs, called LSTM, that addresses these drawbacks.
Contextual information and the architecture of RNNs
Human beings don't start thinking from scratch; the human mind has a so-called persistence of memory, the ability to associate past information with recent information. Traditional neural networks, by contrast, ignore past events. In a movie scene classifier, for example, such a network cannot use a past scene to classify the current one. RNNs were developed to solve this problem:
Figure 1: RNNs have loops
In contrast to conventional neural networks, RNNs are networks with a loop that allows information to persist (Figure 1). Consider a neural network A: at some time step t, it takes an input x_t and outputs a value h_t. From Figure 1, we can therefore think of an RNN as multiple copies of the same network, each passing a message to its successor.
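The loop in Figure 1 can be sketched in a few lines of NumPy. The following is a minimal illustration, not the book's implementation: the weight names (W_xh, W_hh, b_h), the tanh activation, and the dimensions are our own illustrative choices. The key point is that the same weights are reused at every time step, and each step passes its hidden state to the next copy of the network.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4  # illustrative dimensions

# Shared parameters, reused at every time step
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the loop)
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One unrolled time step: h_t depends on the input x_t and on h_{t-1}."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Unroll the loop over a short sequence of 5 inputs: each copy of the
# network passes its hidden state h to its successor.
h = np.zeros(hidden_size)
for x_t in rng.standard_normal((5, input_size)):
    h = rnn_step(x_t, h)

print(h.shape)  # the final hidden state has shape (hidden_size,)
```

Because h is fed back into `rnn_step` at every iteration, information from earlier inputs can influence later outputs, which is exactly the persistence that conventional feed-forward networks lack.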