Hugging Face transformer models
Everything we have learned in this chapter can be condensed into a ready-to-use Hugging Face transformer model.
With Hugging Face, we can implement machine translation in three lines of code!
Open Multi_Head_Attention_Sub_Layer.ipynb
in Google Colaboratory. Save the notebook to your Google Drive (make sure you have a Gmail account). Then, go to the last two cells.
We first ensure that Hugging Face transformers are installed:
!pip -q install transformers
The first cell imports the Hugging Face pipeline, which provides ready-to-use functions for several transformer tasks:
#@title Retrieve pipeline of modules and choose English to French translation
from transformers import pipeline
We then instantiate the Hugging Face pipeline. In our case, to illustrate the Transformer model of this chapter, we activate the translation task and enter a sentence to translate from English to French:
translator = pipeline("translation_en_to_fr")
#One line of code!
print(translator("It is easy to translate languages with transformers", max_length=40))
And voilà! The translation is displayed:
[{'translation_text': "Il est facile de traduire des langues à l'aide de transformateurs."}]
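The pipeline returns a list of dictionaries, one per input sentence. As a minimal sketch, using the sample result shown above, we can extract the translated string from that structure:

```python
# Sample pipeline output copied from the run above; each input sentence
# produces one dict with a 'translation_text' key.
result = [{'translation_text':
           "Il est facile de traduire des langues à l'aide de transformateurs."}]

# Extract the translated sentence from the first (and only) result
translation = result[0]['translation_text']
print(translation)
```

The same indexing pattern works when several sentences are passed to the translator at once, since each one yields its own dictionary in the returned list.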
Hugging Face shows how transformer architectures can be packaged into ready-to-use models.