Introduction
In this chapter, we will learn various recipes for creating recurrent neural networks (RNNs) using Keras. First, we will understand the need for RNNs. We will start with simple RNNs, followed by long short-term memory (LSTM) RNNs; these networks remember state over long periods of time because of special gates in the cell.
The need for RNNs
Traditional neural networks cannot remember their past interactions, which is a significant shortcoming. RNNs address this issue: they are networks with loops in them, allowing information to persist. In the next diagram, a chunk of the neural network, A, looks at some input, xt, and outputs a value, ht. A loop in the network allows information to be passed from one step of the network to the next.
The following diagram shows what such a network looks like:
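The loop described above can be sketched numerically. The following is a minimal illustration (not the Keras implementation) of the recurrence an RNN cell computes: at each step t, the new hidden state depends on the current input xt and the previous state. The shapes, the tanh activation, and all variable names here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of an RNN step (assumed tanh activation and toy shapes)
rng = np.random.default_rng(0)
input_dim, hidden_dim, steps = 3, 4, 5

W_x = rng.standard_normal((hidden_dim, input_dim)) * 0.1   # input-to-hidden weights
W_h = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1  # recurrent (hidden-to-hidden) weights
b = np.zeros(hidden_dim)                                   # bias

xs = rng.standard_normal((steps, input_dim))  # a toy input sequence x1..x5
h = np.zeros(hidden_dim)                      # initial hidden state

for x_t in xs:
    # The "loop": h carries information from one step to the next
    h = np.tanh(W_x @ x_t + W_h @ h + b)

print(h.shape)  # the final hidden state ht has shape (hidden_dim,)
```

Because h is both an output of one step and an input to the next, information from earlier inputs can influence later outputs, which is exactly the persistence that feedforward networks lack.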