Long Short-Term Memory (LSTM)
This model, which represents the state-of-the-art recurrent cell in many fields, was proposed in 1997 by Hochreiter and Schmidhuber (Hochreiter S., Schmidhuber J., Long Short-Term Memory, Neural Computation, Vol. 9, 11/1997) under the emblematic name Long Short-Term Memory (LSTM). As the name suggests, the idea is to design a more complex artificial recurrent neuron that can be plugged into larger networks and trained without the risk of vanishing (and, of course, exploding) gradients. One of the key limitations of classic recurrent networks is that they focus on learning but not on selectively forgetting. Forgetting is indeed necessary to optimize the memory, so that the cell retains what is really important and discards the pieces of information that are not needed to predict new values.
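As an informal preview of this idea of selective forgetting, the following NumPy sketch shows how a sigmoid gate can rescale each component of an internal memory vector, keeping it (values close to 1) or erasing it (values close to 0). All names (W_f, U_f, b_f, the shapes, and the random initialization) are purely illustrative assumptions, not part of the original model description, which is discussed in the following sections.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


rng = np.random.default_rng(1000)

# Illustrative dimensions: 4 input features, 3 memory units
n_features, n_units = 4, 3

# Randomly initialized gate parameters (for illustration only)
W_f = rng.normal(scale=0.1, size=(n_units, n_features))
U_f = rng.normal(scale=0.1, size=(n_units, n_units))
b_f = np.zeros(n_units)

x_t = rng.normal(size=n_features)   # current input
h_prev = np.zeros(n_units)          # previous output of the cell
C_prev = rng.normal(size=n_units)   # previous internal memory

# Gate activation: each component is squashed into (0, 1)
f_t = sigmoid(W_f @ x_t + U_f @ h_prev + b_f)

# Element-wise rescaling: components with f_t close to 0 are forgotten,
# components with f_t close to 1 are kept almost unchanged
C_filtered = f_t * C_prev

print("Gate values:", f_t)
print("Filtered memory:", C_filtered)
```

Since the gate parameters are trainable, a network built on such cells can learn, from the data itself, which pieces of its memory to keep and which to discard at each time step.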
To achieve this goal, LSTM exploits two important features (it's helpful to discuss them before moving on to the model). The first one is an explicit...