Chapter 2: A Hands-On Introduction to the Subject
So far, we have taken an overall look at the evolution of Natural Language Processing (NLP) with Deep Learning (DL)-based methods and learned the basics of the Transformer and its architecture. In this chapter, we will take a deeper look at how a transformer model can be used in practice. Tokenizers and models such as Bidirectional Encoder Representations from Transformers (BERT) will be described in more technical detail, with hands-on examples that include loading a tokenizer/model and using community-provided pretrained models. Before using any specific model, however, we will walk through the installation steps needed to set up the environment with Anaconda. These steps cover installing the required libraries and programs on the major operating systems: Linux, Windows, and macOS. The installation of PyTorch and TensorFlow, in two versions of a Central Processing...