Long Short-Term Memory
LSTM is a type of RNN designed to solve the long-term dependency problem: it can remember values over both long and short time periods. The principal way it differs from a traditional RNN is that it includes a memory cell, with an internal loop, that stores state across time steps.
This type of neural network was created in 1997 by Hochreiter and Schmidhuber. This is the basic schema of an LSTM neuron:
As you can see in the previous figure, the schema of an LSTM neuron is complex. It has three types of gates:
Input gate: Allows us to control how much of the input values is used to update the state of the memory cell.
Forget gate: Allows us to erase the content of the memory cell.
Output gate: Allows us to control how much of the memory cell content is exposed as the neuron's output.
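The three gates above can be sketched as a single LSTM time step in plain NumPy. This is a minimal illustration, not the Keras implementation; the function name `lstm_step` and the layer sizes are assumptions, and the four gate parameter sets are stacked into one weight matrix for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative sketch).

    W, U, b stack the parameters of the four gate computations as rows:
    input gate (i), forget gate (f), cell candidate (g), output gate (o).
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # pre-activations for all four gates, shape (4n,)
    i = sigmoid(z[0:n])              # input gate: how much new information to write
    f = sigmoid(z[n:2 * n])          # forget gate: how much old memory to erase
    g = np.tanh(z[2 * n:3 * n])      # candidate values for the memory cell
    o = sigmoid(z[3 * n:4 * n])      # output gate: how much memory content to expose
    c = f * c_prev + i * g           # update the memory cell
    h = o * np.tanh(c)               # new hidden state (the value returned to the next layer)
    return h, c

# Tiny usage example with hypothetical sizes: 3 input features, 2 hidden units.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
h0, c0 = np.zeros(2), np.zeros(2)
W = rng.standard_normal((8, 3)) * 0.1
U = rng.standard_normal((8, 2)) * 0.1
b = np.zeros(8)
h1, c1 = lstm_step(x, h0, c0, W, U, b)
print(h1.shape, c1.shape)
```

Because the gates are sigmoids in (0, 1), they act as soft switches: a forget gate near 0 erases the cell content, while one near 1 preserves it.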
An LSTM model in Keras has a three-dimensional input:
Sample: The amount of data you have (the quantity of sequences).
Time step: The memory of your network. In other words, it stores previous information in order to make...