In this chapter, we focused on text generation and text summarization. Using a GRU-based RNN, we illustrated an example text generation model that can generate Linux kernel code. When trained on other domains or source texts, such models can help us understand the underlying structure and context of that text. Next, we described the different types of text summarization. We explained a simple extractive summarization approach, using gensim to generate product review summaries. While extractive summarization reproduces words and sentences from the source text, abstractive summarization can generate novel and intuitive summaries.
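As a quick reminder of the extractive step, a gensim call of the kind described above looks roughly like the sketch below. The review text is invented for illustration, and the example assumes gensim 3.x, since the summarization module was removed in gensim 4.0.

```python
from gensim.summarization import summarize  # available in gensim < 4.0

# A hypothetical product review used only to illustrate the call.
review = (
    "I bought this blender a month ago and it has worked well so far. "
    "The motor is powerful enough to crush ice in a few seconds. "
    "Smoothies come out consistently smooth with no chunks left. "
    "The jar is easy to clean and fits in the dishwasher. "
    "It is louder than my old blender, which is my only real complaint. "
    "The controls are simple, with three speeds and a pulse button. "
    "Customer support replied quickly when I asked about the warranty. "
    "Overall I think it is good value for the price."
)

# TextRank-based extraction: keep roughly 30% of the original sentences.
# gensim may log a warning for texts shorter than about ten sentences.
print(summarize(review, ratio=0.3))
```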
To cover abstractive summarization, we introduced an encoder-decoder model built with GRU-based RNNs to summarize news text, using CNN news articles as input data to produce short summaries. Finally, we looked at some state-of-the-art approaches that improve upon these baseline models.
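A minimal sketch of such a GRU encoder-decoder in Keras, assuming a hypothetical vocabulary size and hidden width rather than the chapter's exact configuration, could look like the following:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical sizes for illustration; real values depend on the
# vocabulary built from the CNN news corpus.
vocab_size = 20000
embed_dim = 128
hidden_units = 256

# Encoder: embeds the article tokens and compresses them into a
# single GRU state vector.
encoder_inputs = layers.Input(shape=(None,), name="article_tokens")
enc_emb = layers.Embedding(vocab_size, embed_dim)(encoder_inputs)
_, encoder_state = layers.GRU(hidden_units, return_state=True)(enc_emb)

# Decoder: generates the summary tokens, conditioned on the encoder
# state (teacher forcing during training).
decoder_inputs = layers.Input(shape=(None,), name="summary_tokens")
dec_emb = layers.Embedding(vocab_size, embed_dim)(decoder_inputs)
decoder_outputs, _ = layers.GRU(
    hidden_units, return_sequences=True, return_state=True
)(dec_emb, initial_state=encoder_state)
token_probs = layers.Dense(vocab_size, activation="softmax")(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], token_probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

At inference time, the decoder would be run one step at a time, feeding each predicted token back in as the next input until an end-of-summary token is produced.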