We used Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks in previous chapters for text classification. Beyond such predictive tasks, RNNs can also be used to build generative models. Because RNNs can learn long-term dependencies from an input text, they can generate completely new sequences, and these generative models can be either character-based or word-based. In the next section, we will look at a simple word-based text generation model.
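To make the idea concrete, the loop below is a minimal sketch, not the notebook's implementation, of how a trained word-based generative RNN produces new text: a seed sequence is fed in, the model predicts a distribution over the next word, a word is sampled from that distribution, and the sampled word is appended to the context and fed back in. The names `model`, `word_to_index`, and `index_to_word` are hypothetical placeholders for a trained Keras-style model and vocabulary lookups.

```python
import numpy as np

# Hypothetical objects for illustration: `model` is a trained Keras-style RNN
# mapping a sequence of word indices to a probability distribution over the
# vocabulary; `word_to_index` / `index_to_word` are lookups built from the corpus.
def generate_text(model, seed_words, word_to_index, index_to_word,
                  num_words=50, temperature=1.0):
    generated = list(seed_words)
    for _ in range(num_words):
        # Encode the current context as a batch containing one index sequence
        x = np.array([[word_to_index[w] for w in generated]])
        # Predict the probability distribution over the next word
        probs = model.predict(x, verbose=0)[0]
        # Temperature scaling: lower values make the sampling more conservative
        probs = np.exp(np.log(probs + 1e-8) / temperature)
        probs = probs / probs.sum()
        # Sample the next word and append it so it becomes part of the context
        next_index = np.random.choice(len(probs), p=probs)
        generated.append(index_to_word[next_index])
    return ' '.join(generated)
```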
Generating text using RNNs
Generating Linux kernel code with a GRU
We will now look at a simple, fun example: generating Linux kernel code using an RNN. The complete Jupyter Notebook for this example is available in the book's code repository, under Chapter08...
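As a rough idea of what such a model can look like, the following is a minimal sketch of a GRU-based next-token prediction model in Keras that could be trained on kernel source text. The vocabulary size, embedding dimension, sequence length, and GRU width here are illustrative assumptions, not the settings used in the notebook.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GRU, Dense

# Illustrative hyperparameters -- the notebook's actual values may differ
vocab_size = 5000      # number of distinct tokens in the kernel source corpus
embedding_dim = 128    # size of the learned token embeddings
seq_length = 40        # length of the input context fed to the GRU

model = Sequential([
    # Map each token index to a dense embedding vector
    Embedding(vocab_size, embedding_dim, input_length=seq_length),
    # A single GRU layer summarizes the context into a hidden state
    GRU(256),
    # Predict a probability distribution over the next token
    Dense(vocab_size, activation='softmax'),
])
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam')
model.summary()
```

Training such a model amounts to sliding a fixed-length window over the tokenized source code and asking the network to predict the token that follows each window; once trained, it can be sampled with a loop like the one shown earlier.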