The LSTM (long short-term memory) network is an advanced RNN architecture that aims to solve the vanishing gradient problem and yields excellent results on longer sequences. In the previous chapter, we introduced the GRU network, which is a simpler version of the LSTM. Both include memory states that determine what information should be propagated further at each timestep. The LSTM cell looks as follows:
Let's introduce the main equations that will clarify the preceding diagram. They are similar to the ones for gated recurrent units (see Chapter 3, Generating Your Own Book Chapter). Here is what happens at every given timestep, $t$:

$$f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)$$
$$i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)$$
$$o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)$$
$$\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)$$
$$c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t$$
$$h_t = o_t \odot \tanh(c_t)$$

$o_t$ is the output gate, which determines what exactly is important for the current prediction and what information should be kept around for the future. $i_t$ is called the input gate, and determines...
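To make the gating concrete, here is a minimal sketch of a single LSTM timestep in NumPy, following the standard gate equations. The function and parameter names (`lstm_step`, the `W`/`U`/`b` dictionaries) are illustrative choices for this sketch, not part of any particular library:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM timestep. params holds input weights W, recurrent
    weights U, and biases b, each a dict keyed by gate: f, i, o, c."""
    W, U, b = params["W"], params["U"], params["b"]
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])       # forget gate
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])       # input gate
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])       # output gate
    c_cand = np.tanh(W["c"] @ x_t + U["c"] @ h_prev + b["c"])  # candidate cell state
    c_t = f * c_prev + i * c_cand                              # new cell state
    h_t = o * np.tanh(c_t)                                     # new hidden state
    return h_t, c_t
```

Note that the cell state `c_t` is updated additively (old state scaled by the forget gate plus new candidate scaled by the input gate), which is what lets gradients flow across many timesteps without vanishing.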