Summarizing text using pre-trained Transformer models
We will now explore techniques for performing text summarization. Generating a summary of a long passage lets NLP practitioners extract the information relevant to their use case and feed these summaries into other downstream tasks. The recipes in this section use Transformer models to generate the summaries.
Getting ready
Our first summarization recipe uses Google's Text-to-Text Transfer Transformer (T5) model. You can use the 9.5_summarization.ipynb notebook from the code site if you prefer to work from an existing notebook.
How to do it
Let’s get started:
- Do the necessary imports:
from transformers import pipeline
- In this step, we initialize the input passage that we need to summarize, along with the pipeline. We also calculate the length of the passage, since this will be used as an argument to be passed to the...
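The exact arguments in the original step are truncated above, so the following is only a minimal sketch of how this step could look. It assumes the t5-small checkpoint and a placeholder passage (both are illustrative choices, not taken from the original), and it computes the passage length as a word count so that it can later be passed to the summarizer call:

from transformers import pipeline

# Initialize a summarization pipeline with a T5 checkpoint
# (t5-small is an assumed choice; any T5 variant would work).
summarizer = pipeline("summarization", model="t5-small")

# Placeholder passage; replace this with the text you want to summarize.
passage = (
    "The Transformer architecture has become the foundation of modern NLP. "
    "It relies on self-attention to model relationships between tokens, "
    "which allows it to capture long-range dependencies in text far more "
    "effectively than earlier recurrent approaches."
)

# Calculate the length of the passage in words; this value can be used
# to bound the length of the generated summary.
passage_length = len(passage.split())
print(f"Passage length (words): {passage_length}")

A common pattern, assuming the standard max_length and min_length parameters of the summarization pipeline, is to use this word count when calling the summarizer, for example summarizer(passage, max_length=passage_length // 2, min_length=10); the exact values used in the original recipe are not shown in the truncated text.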