In this chapter, we learned how LSTMs can be used to generate book scripts.
We began by looking at the basics of RNNs and their popular variant, the LSTM. We learned that RNNs are hugely successful at modeling sequential data, such as time series or predicting the next word in natural language processing tasks. We also looked at the advantages and disadvantages of LSTMs.
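To make that recap concrete, here is a minimal sketch of an LSTM-based next-word predictor in Keras. It is not the chapter's exact model; the layer sizes (`vocab_size`, `embed_dim`, `lstm_units`) are illustrative assumptions:

```python
# A minimal next-word predictor: just the general shape of an LSTM used
# for sequence prediction, with illustrative (assumed) sizes.
import tensorflow as tf

vocab_size, embed_dim, lstm_units = 5000, 128, 256

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),         # word IDs -> dense vectors
    tf.keras.layers.LSTM(lstm_units),                         # summarizes the sequence so far
    tf.keras.layers.Dense(vocab_size, activation="softmax"),  # distribution over the next word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```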
This chapter then helped us understand how to preprocess text data and prepare it to be fed into an LSTM. We also looked at the structure of the model used for training. Next, we looked at how to train the neural network by creating batches of data.
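As a refresher, the sketch below shows the kind of preparation involved: mapping words to integer IDs, then slicing the ID stream into input/target batches where the target is the input shifted by one word. The helper names (`build_vocab`, `get_batches`) are ours, not the chapter's, and the book's own pipeline differs in its details:

```python
# A hedged sketch of text preprocessing and batching for an LSTM.
import numpy as np

def build_vocab(text):
    """Map each word to an integer ID and encode the text as a list of IDs."""
    words = text.split()
    word_to_id = {w: i for i, w in enumerate(sorted(set(words)))}
    id_to_word = {i: w for w, i in word_to_id.items()}
    return [word_to_id[w] for w in words], word_to_id, id_to_word

def get_batches(int_text, batch_size, seq_length):
    """Slice the ID stream into (input, target) batches; targets are shifted by one."""
    words_per_batch = batch_size * seq_length
    n_batches = len(int_text) // words_per_batch
    data = np.array(int_text[:n_batches * words_per_batch])
    targets = np.roll(data, -1)                  # next word for every position
    inputs = data.reshape(batch_size, -1)
    targets = targets.reshape(batch_size, -1)
    return list(zip(np.split(inputs, n_batches, axis=1),
                    np.split(targets, n_batches, axis=1)))
```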
Finally, we understood how to generate a book script using the TensorFlow model we trained. Although the script that was generated doesn't make complete sense, it was amazing to observe the network composing sentences on its own.
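For reference, the generation step looks roughly like the loop below: feed the words generated so far back into the model and sample the next word from its softmax output. It assumes the `model` and `id_to_word` mapping sketched above, and the sampling details are our simplification rather than the chapter's exact code:

```python
# A simplified generation loop for the sketched model above.
import numpy as np

def generate(model, seed_ids, id_to_word, n_words=50, seq_length=20):
    """Repeatedly feed the text so far back in and sample the next word."""
    ids = list(seed_ids)
    for _ in range(n_words):
        window = np.array([ids[-seq_length:]])      # last seq_length word IDs
        probs = model.predict(window, verbose=0)[0].astype("float64")
        probs /= probs.sum()                        # guard against float rounding
        ids.append(int(np.random.choice(len(probs), p=probs)))  # sample, don't argmax
    return " ".join(id_to_word[i] for i in ids)
```

Sampling from the distribution (rather than always taking the most likely word) is what gives the generated script its variety; greedy argmax tends to loop on the same few phrases.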