The SimpleRNN layer in Keras is a basic RNN layer like the ones we discussed earlier. Although it accepts many arguments, most have sensible defaults that serve well across a wide range of use cases. Because we have made the RNN layer the first layer of our model, we must pass it an input shape, corresponding to the length of each sequence (which we chose earlier to be 40 characters) and the number of unique characters in our dataset (which was 44). While this model is computationally cheap to run, it suffers badly from the vanishing gradient problem discussed earlier, so it has trouble modeling long-term dependencies:
from keras.models import Sequential
from keras.layers import Dense, Bidirectional, Dropout
from keras.layers import SimpleRNN, GRU, BatchNormalization
from keras.optimizers import RMSprop
'''Fun...
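To make the recurrence concrete, here is a minimal NumPy sketch of what a SimpleRNN cell computes at each timestep, h_t = tanh(x_t · W_x + h_{t-1} · W_h + b), using the sequence length (40) and vocabulary size (44) from the text; the hidden size of 128 and the weight initialization are hypothetical choices for illustration, not taken from the model above:

```python
import numpy as np

def simple_rnn_forward(x_seq, W_x, W_h, b):
    """Run a SimpleRNN cell over one sequence.

    x_seq: array of shape (timesteps, input_dim), e.g. one-hot characters.
    Returns the final hidden state, shape (hidden_dim,).
    """
    h = np.zeros(W_h.shape[0])
    for x_t in x_seq:
        # The same recurrent matrix W_h is applied at every step;
        # this repeated multiplication (through tanh) is why gradients
        # shrink toward zero over long sequences.
        h = np.tanh(x_t @ W_x + h @ W_h + b)
    return h

rng = np.random.default_rng(0)
timesteps, input_dim, hidden_dim = 40, 44, 128  # 128 units is hypothetical

W_x = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

# A random sequence of 40 one-hot character vectors over a 44-symbol vocabulary.
x_seq = np.eye(input_dim)[rng.integers(0, input_dim, size=timesteps)]
h_final = simple_rnn_forward(x_seq, W_x, W_h, b)
print(h_final.shape)
```

Because tanh squashes its input into (-1, 1), every component of the hidden state stays bounded, but the gradient flowing back through 40 applications of W_h and tanh can become vanishingly small, which is the limitation the text refers to.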