Chapter 9: spaCy and Transformers
In this chapter, you will learn about transformers, the latest hot topic in NLP, and how to use them with TensorFlow and spaCy.
First, you will learn about transformers and transfer learning. Second, you'll explore the details of the most commonly used transformer architecture, Bidirectional Encoder Representations from Transformers (BERT). You'll also learn how the BERT tokenizer and the WordPiece algorithm work. Then you will learn how to quickly get started with the pre-trained transformer models of the HuggingFace library. Next, you'll practice fine-tuning HuggingFace Transformers models with TensorFlow and Keras. Finally, you'll learn how spaCy v3.0 integrates transformer models as pre-trained pipelines.
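To give you a taste of what's ahead, here is a minimal sketch of loading a pre-trained model with the HuggingFace library. The task name and the sample sentence are illustrative; the transformers package must be installed separately, and the model weights are downloaded automatically on first use:

# A minimal sketch: load a pre-trained pipeline from HuggingFace Transformers.
# Requires: pip install transformers (model weights download on first use).
from transformers import pipeline

# "sentiment-analysis" selects a default pre-trained model for this task.
classifier = pipeline("sentiment-analysis")

# Run inference on a sample sentence (illustrative input).
print(classifier("spaCy and transformers work well together."))
# The output is a list of dicts, e.g. [{'label': 'POSITIVE', 'score': ...}]

In the same spirit, spaCy v3.0 ships transformer-backed pipelines such as en_core_web_trf, which you load with spacy.load() just like any other spaCy model; we'll cover this at the end of the chapter.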
By the end of this chapter, you will have completed the statistical NLP topics of this book. You will add your knowledge of transformers to the knowledge of Keras and TensorFlow that you acquired in Chapter 8...