Translations with Google Trax
Google Brain developed Tensor2Tensor (T2T) to make deep learning development easier. T2T is an extension of TensorFlow and contains a library of deep learning models, including many transformer examples.
Although T2T was a good start, Google Brain then produced Trax, an end-to-end deep learning library. Trax contains a Transformer model that can be applied to translation. The Google Brain team presently maintains Trax.
This section will focus on the minimal set of functions needed to initialize the English-German translation problem described by Vaswani et al. (2017), illustrating the original Transformer's performance.
We will use preprocessed English and German datasets to show that the transformer architecture is language-agnostic.
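To preview where this is headed, here is a minimal sketch of how a Transformer can be instantiated in Trax with the hyperparameters of Vaswani et al. (2017): d_model=512, 8 attention heads, and 6 encoder and 6 decoder layers. The vocabulary size and maximum sequence length shown are illustrative assumptions, not values taken from the text:

```python
import trax

# Hyperparameters follow Vaswani et al. (2017); input_vocab_size and
# max_len below are illustrative assumptions for an English-German task.
model = trax.models.Transformer(
    input_vocab_size=33300,   # assumed shared English-German subword vocabulary
    d_model=512,              # model dimension from the original paper
    d_ff=2048,                # feed-forward layer dimension
    n_heads=8,                # number of attention heads
    n_encoder_layers=6,       # encoder stack depth
    n_decoder_layers=6,       # decoder stack depth
    max_len=2048,             # assumed maximum sequence length
    mode='predict')           # inference mode for translation
```

Because the same architecture is configured identically for any language pair, only the vocabulary and training data change, which is what makes the transformer language-agnostic.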
Open Trax_Google_Translate.ipynb. We will begin by installing the modules we need.
Installing Trax
Google Brain has made Trax easy to install and run. We will import the basics along with Trax, which can be installed with a single pip command.
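As a sketch, Trax is published on PyPI and can typically be installed with pip. The notebook may pin a specific release for reproducibility; no version is pinned here:

```shell
# Install Trax from PyPI (in a notebook cell, prefix the command with "!")
pip install trax
```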