Exploring RNN architectures
In this section, we will take a look at various kinds of RNN architectures and understand how they differ from each other in design and implementation.
LSTM
Long short-term memory (LSTM) is a special kind of RNN architecture that's capable of learning long-term dependencies. It was introduced by Sepp Hochreiter and Jürgen Schmidhuber in 1997 and was subsequently refined in the work of many other researchers. It effectively addresses many of the problems we've discussed and is now widely used.
In an LSTM, each unit contains a memory cell and three gates (filters): an input gate, an output gate, and a forget gate. The purpose of these gates is to control what information is written to the memory cell, how much of the old memory is kept, and what part of the cell's content is exposed as output at each time step.
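To make the role of each gate concrete, here is a minimal NumPy sketch of a single LSTM time step. The function name, the stacked weight layout, and the gate ordering are our own illustrative choices rather than the API of any particular library:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, W, U, b):
    # x_t:    input at time t, shape (input_size,)
    # h_prev: previous hidden state, shape (hidden_size,)
    # c_prev: previous memory cell state, shape (hidden_size,)
    # W, U:   input and recurrent weights, shapes (4*hidden_size, input_size)
    #         and (4*hidden_size, hidden_size); rows ordered [input, forget, candidate, output]
    # b:      bias, shape (4*hidden_size,)
    n = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b              # all four pre-activations at once
    i = sigmoid(z[0:n])                       # input gate: how much new content to write
    f = sigmoid(z[n:2 * n])                   # forget gate: how much old memory to keep
    g = np.tanh(z[2 * n:3 * n])               # candidate memory content
    o = sigmoid(z[3 * n:4 * n])               # output gate: how much of the cell to expose
    c_t = f * c_prev + i * g                  # update the memory cell
    h_t = o * np.tanh(c_t)                    # hidden state passed to the next step
    return h_t, c_t

# Usage: run a few steps with randomly initialized weights
rng = np.random.default_rng(0)
input_size, hidden_size = 8, 16
W = rng.standard_normal((4 * hidden_size, input_size)) * 0.1
U = rng.standard_normal((4 * hidden_size, hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)

h, c = np.zeros(hidden_size), np.zeros(hidden_size)
for t in range(5):
    h, c = lstm_cell_step(rng.standard_normal(input_size), h, c, W, U, b)
print(h.shape, c.shape)  # (16,) (16,)

Note how the forget gate multiplies the previous cell state elementwise: values close to 1 preserve old memory, while values close to 0 erase it, which is what lets the cell carry information across long sequences.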