A Hands-On Introduction to the Subject
So far, we have taken a broad look at the evolution of natural language processing (NLP) using deep learning (DL)-based methods and learned the basics of transformers and their architecture. In this chapter, we will take a deeper look at how a transformer model is used in practice. Tokenizers and models, such as Bidirectional Encoder Representations from Transformers (BERT), will be described in more technical detail with hands-on examples, including how to load a tokenizer/model and how to use community-provided pretrained models. Before working with any specific model, however, we will cover the installation steps required to set up the necessary environment using Anaconda, including how to install the relevant libraries and programs on various operating systems such as Linux, Windows, and macOS. The installation of PyTorch and TensorFlow, with two versions of a central processing unit (CPU) and a...
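As a preview of the hands-on examples to come, the following is a minimal sketch of loading a community-provided pretrained model and its tokenizer with the Hugging Face `transformers` library. The checkpoint name `bert-base-uncased` is one widely used public BERT checkpoint chosen here for illustration; any model ID from the Hugging Face Hub can be loaded the same way (downloading it requires an internet connection on first use).

```python
# A minimal sketch: load a pretrained BERT checkpoint and its tokenizer
# from the Hugging Face Hub. "bert-base-uncased" is an illustrative choice.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer("Transformers are changing NLP.", return_tensors="pt")
outputs = model(**inputs)

# BERT-base produces a 768-dimensional hidden state for every input token.
print(outputs.last_hidden_state.shape)
```

The `Auto*` classes pick the right tokenizer and model implementation from the checkpoint's configuration, so the same two lines work for many community models without code changes; the details of these classes are covered later in the chapter.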