Chapter 4: Autoregressive and Other Language Models
We looked at the details of Autoencoder (AE) language models in Chapter 3, Autoencoding Language Models, and studied how an AE language model can be trained from scratch. In the current chapter, you will see the theoretical details of Autoregressive (AR) language models and learn how to pre-train them on your own corpus. You will learn how to pre-train a language model such as Generative Pre-trained Transformer 2 (GPT-2) on your own text and use it in various tasks such as Natural Language Generation (NLG). You will understand the basics of the Text-to-Text Transfer Transformer (T5) model and train a Multilingual T5 (mT5) model on your own Machine Translation (MT) data. After finishing this chapter, you will have an overview of AR language models and their various use cases in text2text applications, such as summarization, paraphrasing, and MT.
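As a quick preview of the kind of NLG workflow covered later in this chapter, the following is a minimal sketch that loads a pretrained GPT-2 checkpoint with the Hugging Face transformers library and generates a continuation for a prompt. The prompt text and generation parameters shown here are illustrative choices, not values prescribed by this chapter:

```python
from transformers import pipeline

# Load a pretrained GPT-2 checkpoint for text generation
generator = pipeline("text-generation", model="gpt2")

# Generate a continuation for an illustrative prompt
outputs = generator(
    "Transformers are",
    max_length=30,          # total length: prompt plus generated tokens
    num_return_sequences=1, # number of alternative continuations
)
print(outputs[0]["generated_text"])
```

Later sections of this chapter go beyond this inference-only usage and show how to pre-train such a model on your own corpus.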
The following topics will be covered in this chapter:
- Working with AR language models...