Stacking multiple LSTM layers
Just as we can increase the depth of neural networks or CNNs, we can increase the depth of recurrent networks. In this recipe we apply a three-layer LSTM to improve our Shakespeare language generation.
Getting ready
We can increase the depth of recurrent neural networks by stacking them on top of each other. Essentially, the output sequence of one recurrent layer is fed as the input sequence to the next layer. To get an idea of how this might work for just two layers, see the following figure:
TensorFlow makes it easy to implement multiple layers with the MultiRNNCell() function, which accepts a list of RNN cells. With this behavior, a multi-layer RNN can be created in Python with MultiRNNCell([rnn_cell]*num_layers). Note, however, that repeating the same cell object this way causes every layer to share one set of weights in older TensorFlow versions, and raises an error in later 1.x releases; creating a separate cell instance per layer is the safer pattern.
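The following is a minimal sketch of this pattern using the TensorFlow 1.x API; the names rnn_size and num_layers and the placeholder shape are illustrative assumptions, not values taken from this recipe:

import tensorflow as tf

rnn_size = 128   # hidden units per layer (illustrative value)
num_layers = 3   # depth of the stacked LSTM

# Create a separate cell instance per layer; see the caveat above
# about reusing a single cell object across layers.
cells = [tf.nn.rnn_cell.BasicLSTMCell(rnn_size) for _ in range(num_layers)]
stacked_cell = tf.nn.rnn_cell.MultiRNNCell(cells)

# The stacked cell plugs into dynamic_rnn exactly like a single cell;
# inputs is a [batch, time, features] tensor.
inputs = tf.placeholder(tf.float32, [None, None, rnn_size])
outputs, final_state = tf.nn.dynamic_rnn(stacked_cell, inputs,
                                         dtype=tf.float32)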
For this recipe, we will perform the same Shakespeare prediction that we performed in the previous recipe, but with the deeper, stacked-LSTM model described above.