Generating text with RNNs
Now let's look at our first example of using an RNN for an interesting task. In this exercise, we will use an RNN to generate a fairy tale! This is a one-to-one RNN problem: at each time step, the network consumes one input and predicts one output. We will train a single-layer RNN on a collection of fairy tales and then ask it to generate a new story. For this task, we will use a small text corpus of 20 different tales (which we will increase later). This example will also highlight one of the crucial limitations of RNNs: the lack of persistent long-term memory. This exercise is available in rnn_language_bigram.ipynb in the ch6 folder.
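As the notebook's name suggests, the corpus is processed at the bigram level, that is, two characters at a time. The following is a minimal sketch of that preprocessing step, assuming non-overlapping two-character bigrams; the function and variable names are illustrative and not the notebook's actual code:

```python
def to_bigrams(text):
    """Split a string into non-overlapping two-character bigrams.
    A trailing odd character is dropped."""
    return [text[i:i + 2] for i in range(0, len(text) - 1, 2)]

story = "once upon a time there lived a king"
bigrams = to_bigrams(story)

# Build a vocabulary mapping each distinct bigram to an integer ID,
# which is what the RNN consumes and predicts at each time step.
vocab = {bg: i for i, bg in enumerate(sorted(set(bigrams)))}
ids = [vocab[bg] for bg in bigrams]

print(bigrams[:4])  # ['on', 'ce', ' u', 'po']
print(len(vocab))
```

Working with bigrams rather than single characters roughly halves the sequence length, which eases the burden on the RNN's limited memory at the cost of a larger vocabulary.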
Defining hyperparameters
First, we will define several hyperparameters needed for our RNN, as shown here:
The number of unrolling steps to perform at a single optimization step. This is the number of time steps the input is unrolled for, as discussed in the TBPTT method (T in the Truncated BPTT – training RNNs efficiently section). The higher this number is, the longer the RNN's memory is. However, due to the vanishing...