An RNN models sequences, in this case, words, though it can analyze anything that comes in a sequence, including images. To speed up the mind-dataset process, data augmentation can be applied to the text here exactly as it is applied to images in other models, as sketched below. Here, the RNN will be applied to words.
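As an illustration of what augmentation could look like for word sequences, here is a minimal sketch using random word dropout and adjacent swaps; the function name and parameters are assumptions for illustration, not the book's actual pipeline:

```python
import random

def augment(tokens, p_drop=0.1, n_swaps=1):
    """Return a lightly perturbed copy of a token sequence."""
    # Word dropout: randomly remove a small fraction of the words.
    out = [t for t in tokens if random.random() > p_drop]
    # Swap a few adjacent words to vary the word order slightly.
    for _ in range(n_swaps):
        if len(out) > 1:
            i = random.randrange(len(out) - 1)
            out[i], out[i + 1] = out[i + 1], out[i]
    return out

print(augment("the quick brown fox jumps over the lazy dog".split()))
```

Each call produces a slightly different sequence, multiplying the effective size of a small text dataset, just as random crops and flips do for images.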
A first look at its data flow graph structure shows that an RNN is a neural network much like the others previously explored. The following graphs were obtained by first running LSTM.py and then Tensorboard_reader.py:

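As a rough sketch of how such a graph dump is produced, assuming TensorFlow 1.x as used by LSTM.py (the tiny placeholder graph and the ./logs directory are illustrative, not the book's actual code):

```python
import tensorflow as tf

# A toy graph just to show the export mechanism.
x = tf.placeholder(tf.float32, [None, 28], name="x")
w = tf.Variable(tf.zeros([28, 10]), name="weights")
b = tf.Variable(tf.zeros([10]), name="biases")
logits = tf.add(tf.matmul(x, w), b, name="logits")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # FileWriter serializes the graph definition; TensorBoard (or a
    # reader script) then renders the data flow structure seen above.
    writer = tf.summary.FileWriter("./logs", sess.graph)
    writer.close()
```

TensorBoard can then be pointed at the log directory with `tensorboard --logdir ./logs`.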
The y inputs (the target labels) go to the loss function (Loss_train). The x inputs (the training data) are transformed through weights and biases into logits, to which a softmax function is applied. Zooming into the RNN area of the graph shows the following basic_lstm cell:

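A minimal sketch of the subgraph just described, assuming TensorFlow 1.x; the dimensions, variable names, and node names such as Loss_train are illustrative rather than LSTM.py's actual values:

```python
import tensorflow as tf

n_steps, n_input, n_hidden, n_classes = 28, 28, 128, 10

# x: training data; y: target labels fed to the loss function.
x = tf.placeholder(tf.float32, [None, n_steps, n_input], name="x")
y = tf.placeholder(tf.float32, [None, n_classes], name="y")

# The basic_lstm cell that appears in the zoomed TensorBoard view.
cell = tf.nn.rnn_cell.BasicLSTMCell(n_hidden)
outputs, _ = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

# Weights and biases transform the last RNN output into logits.
weights = tf.Variable(tf.random_normal([n_hidden, n_classes]))
biases = tf.Variable(tf.random_normal([n_classes]))
logits = tf.matmul(outputs[:, -1, :], weights) + biases

# The loss node (Loss_train in the graph) applies softmax internally;
# a separate softmax turns the logits into probabilities.
loss_train = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits),
    name="Loss_train")
prediction = tf.nn.softmax(logits, name="prediction")
```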
What makes an RNN special is found inside the LSTM cell.
...