Use case: Using BERT to answer questions
Now let’s learn how to implement BERT, train it on a question-answering dataset, and use the trained model to answer a given question.
Introduction to the Hugging Face transformers library
We will use the transformers library built by Hugging Face. The transformers library is a high-level API built on top of TensorFlow, PyTorch, and JAX. It provides easy access to pre-trained Transformer models that can be downloaded and fine-tuned with minimal effort. You can find models in the Hugging Face model registry at https://huggingface.co/models, where you can filter models by task, examine the underlying deep learning framework, and more.
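To give a sense of how little code the library demands, here is a minimal sketch of downloading a pre-trained question-answering model from the registry and querying it. The distilbert-base-cased-distilled-squad checkpoint is an illustrative choice (not the model we will train in this section); any model tagged "question-answering" in the registry could be substituted.

```python
from transformers import pipeline

# Download a pre-trained question-answering model from the model registry.
# The checkpoint name is an illustrative choice, not this chapter's model.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# Ask the model a question grounded in a short context passage
result = qa(
    question="Where is Hugging Face based?",
    context="Hugging Face is a company based in New York City.",
)

# The pipeline returns the answer span and a confidence score
print(result["answer"], result["score"])
```

The pipeline hides tokenization, model inference, and answer-span decoding behind a single call, which is exactly the low-friction experience the library is designed for.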
The transformers library was designed to provide a very low barrier to entry for using complex Transformer models. For this reason, there is only a handful of concepts you need to learn in order to hit the ground running with the library. Three important classes are required to...
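In practice, work with the library typically revolves around a configuration class, a tokenizer class, and a model class. The sketch below uses the Auto* convenience classes to load all three from a single checkpoint; the checkpoint name is again an illustrative choice, and the example assumes a PyTorch backend.

```python
from transformers import AutoConfig, AutoTokenizer, AutoModelForQuestionAnswering

# Illustrative checkpoint; any BERT-style QA checkpoint from the
# model registry can be substituted.
checkpoint = "distilbert-base-cased-distilled-squad"

config = AutoConfig.from_pretrained(checkpoint)        # architecture/hyperparameters
tokenizer = AutoTokenizer.from_pretrained(checkpoint)  # text -> token IDs
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

# Tokenize a (question, context) pair into model-ready tensors
inputs = tokenizer(
    "Where is Hugging Face based?",
    "Hugging Face is a company based in New York City.",
    return_tensors="pt",
)

# The QA head produces one start logit and one end logit per input token
outputs = model(**inputs)
print(outputs.start_logits.shape)
```

The Auto* classes inspect the checkpoint's configuration and instantiate the matching concrete classes for you, so the same three-step pattern carries over to other architectures in the registry.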