Text summarization with T5
NLP summarization tasks distill a text into a shorter version that preserves its essential information. In this section, we will start by presenting the Hugging Face resources we will use in this chapter. Then we will initialize a T5-large transformer model. Finally, we will see how to use T5 to summarize any type of document, including legal and corporate documents.
Let's begin by using Hugging Face's framework.
Hugging Face
Hugging Face designed a framework to implement Transformers at a higher level. We used Hugging Face to fine-tune a BERT model in Chapter 2, Fine-Tuning BERT Models, and to train a RoBERTa model in Chapter 3, Pretraining a RoBERTa Model from Scratch.
However, we needed to explore other approaches, such as Trax, in Chapter 5, Machine Translation with the Transformer, and OpenAI's GitHub repository in Chapter 6, Text Generation with OpenAI GPT-2 and GPT-3 Models.
In this chapter, we will use Hugging Face's framework again and explain more about...