Text Generation
In Chapter 9, Recurrent Neural Networks, you were introduced to natural language processing (NLP) and text generation (also known as language modeling) while working through some sequential data problems. In this section, you will extend your sequence model for text generation, using the same dataset to generate extended headlines.
Previously in this book, you saw that sequential data is data in which each point depends on the points that come before it, so the order of the data matters. Recall the bag-of-words example from Chapter 9, Recurrent Neural Networks: with the bag-of-words approach, you simply used a set of word counts to derive meaning from a text. As you can see in Figure 11.1, the two sentences have completely opposite semantic meanings, but would be identical in a bag-of-words format. While this may be an effective strategy for some problems, it is not an ideal approach for predicting the next word or words.
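The limitation can be demonstrated in a few lines of Python. The two sentences below are illustrative stand-ins (not necessarily the exact sentences shown in Figure 11.1): they convey opposite meanings, yet their bag-of-words representations are identical because word order is discarded.

```python
from collections import Counter

# Two sentences with opposite meanings (hypothetical examples
# for illustration, not the exact pair from Figure 11.1)
sentence_a = "the movie was good not bad"
sentence_b = "the movie was bad not good"

# A bag-of-words representation keeps only word counts,
# throwing away the order in which the words appear
bow_a = Counter(sentence_a.split())
bow_b = Counter(sentence_b.split())

print(bow_a == bow_b)  # True: identical counts despite opposite meanings
```

Because both sentences produce the same count vector, any model built purely on these counts cannot tell them apart, which is why sequence-aware models are needed for next-word prediction.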