Chapter 3: Autoencoding Language Models
In the previous chapter, we studied how a typical Transformer model can be used with Hugging Face's Transformers library. So far, the focus has been on using pre-built models, with little attention given to specific models and how they are trained.
In this chapter, we will learn how to train autoencoding language models on any given language from scratch. This training covers both pre-training and task-specific fine-tuning. First, we will cover the basics of the BERT model and how it works. Then we will train a language model on a simple, small corpus. Finally, we will look at how the trained model can be used inside any Keras model, a taste of which is sketched below.
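As a first glimpse of where we are headed, here is a minimal sketch of loading a pre-trained BERT model with Hugging Face's Transformers and reading out its hidden states in TensorFlow/Keras. The checkpoint name and example sentence are illustrative only; the chapter itself will show how to train such a model from scratch.

```python
# A minimal sketch: load a pre-trained BERT checkpoint and inspect its
# output with TensorFlow/Keras. Assumes `transformers` and `tensorflow`
# are installed; "bert-base-uncased" is just an illustrative checkpoint.
from transformers import BertTokenizerFast, TFBertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = TFBertModel.from_pretrained("bert-base-uncased")

# Tokenize an example sentence into TensorFlow tensors
inputs = tokenizer("Autoencoding models predict masked tokens.", return_tensors="tf")

# The last hidden state has shape (batch_size, sequence_length, hidden_size)
outputs = model(inputs)
print(outputs.last_hidden_state.shape)
```

The `TFBertModel` object behaves like any other Keras layer, which is what makes it possible to plug it into a larger Keras model later in the chapter.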
In this chapter, we will cover the following topics:
- BERT – one of the autoencoding language models
- Autoencoding language model training for any language
- Sharing...