Transformers and TensorFlow
In this section, we'll dive into transformer code with TensorFlow. Pre-trained transformer models are provided to the developer community as open source by many organizations, including Google (https://github.com/google-research/bert), Facebook (https://github.com/pytorch/fairseq/blob/master/examples/language_model/README.md), and HuggingFace (https://github.com/huggingface/transformers). All of these organizations offer pre-trained models together with convenient interfaces for integrating transformers into our Python code; the interfaces are compatible with PyTorch, TensorFlow, or both.
Throughout this chapter, we'll be using HuggingFace's pre-trained transformer models via their TensorFlow interface. HuggingFace is an AI company focused on NLP and strongly committed to open source. In the next section, we'll take a closer look at what is available in HuggingFace Transformers.
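As a first taste, here is a minimal sketch of loading a pre-trained model through HuggingFace's TensorFlow interface. It assumes the transformers and tensorflow packages are installed; the model name bert-base-uncased is just one example of the many checkpoints HuggingFace hosts:

```python
# A minimal sketch: load a pre-trained BERT checkpoint and its tokenizer
# via HuggingFace's TensorFlow classes. Downloads the weights on first use.
from transformers import AutoTokenizer, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence into TensorFlow tensors and run it through the model
inputs = tokenizer("Transformers with TensorFlow", return_tensors="tf")
outputs = model(inputs)

# The output embeddings have shape (batch_size, sequence_length, hidden_size);
# bert-base-uncased uses a hidden size of 768
print(outputs.last_hidden_state.shape)
```

The same tokenizer and model names work with the PyTorch classes as well (AutoModel instead of TFAutoModel), which is what makes the HuggingFace interface convenient across frameworks.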
HuggingFace Transformers
In the first...