Generating text
In this recipe, we will use a generative transformer model to generate text from a given seed sentence. One such model is GPT-2, an improved successor to the original Generative Pre-trained Transformer (GPT) model.
Getting ready
As part of this recipe, we will use the pipeline module from the transformers package. You can use the 8.5_Transformer_text_generation.ipynb notebook from the code site if you prefer to work from an existing notebook.
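If the transformers package is not yet installed in your environment, a minimal setup might look like the following sketch. The torch backend and the unpinned install command are assumptions, not requirements stated in this recipe:

```python
# Assumed environment setup: in a notebook cell, you could install
# the dependencies with:
#   %pip install transformers torch
from transformers import pipeline  # the pipeline API used throughout this recipe
```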
How to do it...
In this recipe, we will start with an initial seed sentence and use the GPT-2 model to generate text that continues it. We will also adjust a few generation parameters to improve the quality of the generated text.
The recipe does the following things:
- It initializes a starting sentence from which a continuing sentence will be generated
- It initializes a GPT-2 model as part of a pipeline and uses it to generate five sequences, with the number of sequences passed in as a parameter (see the sketch after this list)
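The following is a minimal sketch of these steps using the transformers text-generation pipeline. The seed sentence and the fixed random seed are illustrative choices, not part of the original recipe:

```python
from transformers import pipeline, set_seed

# Fix the random seed so the sampled continuations are reproducible
set_seed(42)

# Hypothetical seed sentence; any short prompt works
seed_sentence = "It was a dark and stormy night"

# Load GPT-2 behind the text-generation pipeline
generator = pipeline("text-generation", model="gpt2")

# Generate five continuations; max_length caps the total length
# (prompt plus continuation) in tokens
results = generator(
    seed_sentence,
    max_length=50,
    num_return_sequences=5,
    do_sample=True,  # sample instead of greedy decoding, so the five outputs differ
)

for i, result in enumerate(results, start=1):
    print(f"--- Sequence {i} ---")
    print(result["generated_text"])
```

Each element of results is a dictionary whose generated_text field contains the seed sentence followed by the model's continuation; re-running without set_seed will produce different outputs.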