Text summarization with T5
NLP summarization tasks extract the essential parts of a text to produce a succinct version of it. This section will start by presenting the Hugging Face resources we will use in this chapter. Then we will initialize a T5-large transformer model. Finally, we will see how to use T5 to summarize any document, including legal and corporate documents.
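As a preview of where this section is headed, here is a minimal sketch of T5 summarization with the Hugging Face transformers library, assuming it is installed. The sample text, token limits, and beam-search parameters are illustrative placeholders, not the chapter's final settings:

```python
# Minimal sketch: summarizing a document with a pretrained T5 model.
# Assumptions: transformers and sentencepiece are installed; the sample
# text and generation parameters are placeholders for illustration.
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "t5-large"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Any document could go here: a legal clause, a corporate report, etc.
text = "The parties agree that any dispute arising from this contract " \
       "shall be settled by arbitration before litigation is considered."

# T5 is a text-to-text model: the task is declared with a prefix.
inputs = tokenizer("summarize: " + text, return_tensors="pt",
                   max_length=512, truncation=True)
summary_ids = model.generate(inputs["input_ids"],
                             num_beams=4, min_length=10, max_length=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```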
Let’s begin by introducing Hugging Face’s framework.
Hugging Face
Hugging Face designed a framework to implement Transformers at a higher level. We used Hugging Face to fine-tune a BERT model in Chapter 3, Fine-Tuning BERT Models, and train a RoBERTa model in Chapter 4, Pretraining a RoBERTa Model from Scratch.
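To illustrate what "a higher level" means in practice, the following sketch uses Hugging Face's pipeline API, which wraps tokenization, model loading, and generation in a single call. The model choice and input sentence are assumptions for illustration only:

```python
# Minimal sketch of the high-level pipeline API (illustrative model choice).
from transformers import pipeline

# One call builds a ready-to-use summarizer around a pretrained model.
summarizer = pipeline("summarization", model="t5-small")
result = summarizer("Hugging Face provides a high-level API that hides "
                    "tokenization and generation details from the user.",
                    min_length=5, max_length=20)
print(result[0]["summary_text"])
```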
To expand our knowledge, we explored other approaches, such as Trax in Chapter 6, Machine Translation with the Transformer, and OpenAI's models in Chapter 7, The Rise of Suprahuman Transformers with GPT-3 Engines. This chapter will use Hugging Face's framework again and explain more about the...