Long short-term memory (LSTM)
The LSTM is a variant of the RNN that is capable of learning long-term dependencies. LSTMs were first proposed by Hochreiter and Schmidhuber in 1997 and refined by many other researchers. They work well on a large variety of problems and are the most widely used type of RNN.
We have seen how the SimpleRNN uses the hidden state from the previous time step and the current input in a tanh layer to implement recurrence. LSTMs also implement recurrence in a similar way, but instead of a single tanh layer, there are four layers interacting in a very specific way. The following diagram illustrates the transformations that are applied to the hidden state at time step t:
The diagram looks complicated, but let us look at it component by component. The line across the top of the diagram is the cell state c, which represents the internal memory of the unit. The line across the bottom is the hidden state h, and the i, f, o, and g gates are the mechanism by which the LSTM works around the vanishing gradient problem.
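To make the four interacting layers concrete, the following is a minimal NumPy sketch of a single LSTM step. It assumes the standard formulation in which each of i, f, o, and g applies its own affine transformation to the current input and the previous hidden state; the function and variable names here are illustrative, not from any particular library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, and b are dicts holding separate
    parameters for each of the i, f, o, and g transformations."""
    # Each of the four layers sees the same inputs:
    # the current input x_t and the previous hidden state h_prev.
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])   # input gate
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])   # forget gate
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])   # output gate
    g = np.tanh(W["g"] @ x_t + U["g"] @ h_prev + b["g"])   # candidate cell state
    # Cell state: forget part of the old memory, add part of the candidate.
    c_t = f * c_prev + i * g
    # Hidden state: a gated, squashed view of the cell state.
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Toy usage with a 4-dimensional input and 3-dimensional state.
rng = np.random.default_rng(0)
W = {k: rng.normal(size=(3, 4)) for k in "ifog"}
U = {k: rng.normal(size=(3, 3)) for k in "ifog"}
b = {k: np.zeros(3) for k in "ifog"}
h, c = np.zeros(3), np.zeros(3)
h, c = lstm_step(rng.normal(size=4), h, c, W, U, b)
```

Note how the cell state c flows from step to step modified only by elementwise multiplication and addition, which is what lets gradients pass through many time steps without vanishing.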