Summary
You have completed an exhaustive chapter about a very hot topic in NLP. Congratulations! In this chapter, you started by learning what sort of models transformers are and what transfer learning is. Then, you learned about BERT, the most widely used Transformer-based architecture. You learned about its architecture details and its specific input format, as well as the BERT tokenizer and the WordPiece algorithm.
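As a quick recap of WordPiece tokenization, the following minimal sketch (not code from the chapter; the "bert-base-uncased" checkpoint is assumed) shows how the BERT tokenizer splits an out-of-vocabulary word into subword pieces marked with "##":

from transformers import BertTokenizer

# Load the pre-trained WordPiece vocabulary for the uncased base model
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# "embeddings" is not in the vocabulary, so it is split into subwords
print(tokenizer.tokenize("Here is the sentence I want embeddings for."))
# ['here', 'is', 'the', 'sentence', 'i', 'want', 'em', '##bed', '##ding', '##s', 'for', '.']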
Next, you became familiar with BERT code by using the popular HuggingFace Transformers library. You practiced fine-tuning BERT on a custom dataset for a sentiment analysis task with TensorFlow and Keras. You also practiced using pre-trained HuggingFace pipelines for a variety of NLP tasks, such as text classification and question answering. Finally, you explored the Transformers integration introduced in the new spaCy release, spaCy v3.0.
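To recap the pipeline API, here is a minimal sketch (not code from the chapter) covering the two tasks mentioned above; each pipeline downloads a default pre-trained model on first use, and the example inputs are illustrative:

from transformers import pipeline

# Text classification (sentiment analysis) with the default model
classifier = pipeline("sentiment-analysis")
print(classifier("I really enjoyed this chapter!"))
# [{'label': 'POSITIVE', 'score': ...}]

# Question answering: extract an answer span from a given context
qa = pipeline("question-answering")
print(qa(question="What does BERT use for tokenization?",
         context="BERT uses the WordPiece algorithm to split words into subword units."))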
With this chapter, you have completed the statistical NLP sections of this book. Now you're ready to put everything you learned together...