An RNN models sequences, in this case, sequences of words, although it can analyze anything that comes in a sequence, including images. To speed up the mind-dataset process, data augmentation can be applied to these word sequences exactly as it is applied to images in other models.
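As a rough illustration of what word-level augmentation can look like, the following minimal sketch perturbs a word sequence with random adjacent swaps and random deletions. The function name, parameters, and perturbation choices are hypothetical and are not part of the chapter's code:

```python
import random

def augment_word_sequence(words, swap_prob=0.1, drop_prob=0.1, seed=None):
    """Return a perturbed copy of a word sequence (hypothetical example).

    Randomly swaps adjacent words and drops words, mirroring the
    crop/flip-style perturbations used for image augmentation.
    """
    rng = random.Random(seed)
    out = list(words)
    # Randomly swap adjacent pairs.
    for i in range(len(out) - 1):
        if rng.random() < swap_prob:
            out[i], out[i + 1] = out[i + 1], out[i]
    # Randomly drop words, keeping at least one word.
    kept = [w for w in out if rng.random() >= drop_prob]
    return kept if kept else out[:1]

print(augment_word_sequence("the cat sat on the mat".split(), seed=0))
```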
A first look at its dataflow graph structure shows that an RNN is a neural network like the others explored previously. The following graphs were obtained by first running LSTM.py and then Tensorboard_reader.py:
![](https://static.packt-cdn.com/products/9781788990547/graphics/assets/f98db284-bcec-4f27-9ddd-92090f1d1921.jpg)
The y inputs (the training labels) go to the loss function (Loss_train). The x inputs (the training data) are transformed through weights and biases into logits, which are then fed through a softmax function. A zoom into the RNN area of the graph shows the following basic_lstm cell:
![](https://static.packt-cdn.com/products/9781788990547/graphics/assets/2e531ea3-050b-4453-960b-24536aac7437.jpg)
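Before looking inside the cell, the overall data flow in the graph above can be sketched in code. The following is a minimal TensorFlow 1.x-style approximation, not a reproduction of LSTM.py; the shape constants and variable names are hypothetical:

```python
import tensorflow as tf  # assumes TensorFlow 1.x APIs

num_steps, num_inputs, num_hidden, num_classes = 28, 28, 128, 10  # hypothetical shapes

# x: input sequences; y: one-hot labels fed to the loss node.
x = tf.placeholder(tf.float32, [None, num_steps, num_inputs])
y = tf.placeholder(tf.float32, [None, num_classes])

# The basic_lstm cell that appears in the TensorBoard graph.
cell = tf.nn.rnn_cell.BasicLSTMCell(num_hidden)
outputs, _ = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

# Weights and biases turn the last LSTM output into logits.
weights = tf.Variable(tf.random_normal([num_hidden, num_classes]))
biases = tf.Variable(tf.zeros([num_classes]))
logits = tf.matmul(outputs[:, -1, :], weights) + biases

# Softmax cross-entropy connects the logits and y in the loss.
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits))
train_op = tf.train.AdamOptimizer(0.001).minimize(loss)
```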
What makes an RNN special can be found in the LSTM cell.
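As a preview of what happens inside that cell, here is a minimal NumPy sketch of the standard LSTM gate equations. The variable names are illustrative and are not taken from LSTM.py:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step using the standard gate equations.

    W has shape (hidden + input, 4 * hidden): stacked weights for the
    input (i), forget (f), cell candidate (g), and output (o) gates.
    """
    n = h_prev.shape[0]
    z = np.concatenate([h_prev, x_t]) @ W + b
    i = sigmoid(z[:n])            # input gate: what to write
    f = sigmoid(z[n:2 * n])       # forget gate: what to erase
    g = np.tanh(z[2 * n:3 * n])   # candidate values
    o = sigmoid(z[3 * n:])        # output gate: what to expose
    c = f * c_prev + i * g        # new cell state (long-term memory)
    h = o * np.tanh(c)            # new hidden state (short-term output)
    return h, c

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
hidden, inputs = 4, 3
W = rng.standard_normal((hidden + inputs, 4 * hidden))
b = np.zeros(4 * hidden)
h, c = np.zeros(hidden), np.zeros(hidden)
h, c = lstm_step(rng.standard_normal(inputs), h, c, W, b)
```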
...