Translations with Trax
Google Brain developed Tensor2Tensor (T2T) to make deep learning development easier. T2T is an extension of TensorFlow containing a library of deep learning models, including many Transformer examples.
Though T2T was a good start, Google Brain went on to produce Trax, an end-to-end deep learning library. Trax includes a Transformer model that can be applied to translation. The Google Brain team presently maintains Trax.
In this section, we will focus on the minimal set of functions needed to run the English-German translation problem described in Vaswani et al. (2017) and illustrate the Transformer's performance.
We will use preprocessed English and German datasets to show that the Transformer architecture is language-agnostic.
Open Trax_Translation.ipynb.
We will begin by installing the modules we need.
Installing Trax
Google Brain has made Trax easy to install and run. We will import the basics along with Trax, which can be installed in one line.
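In a notebook environment such as Google Colab, that one line is a `pip` command (the exact version to pin is an assumption; check the Trax releases page for the version that matches your environment):

```shell
# Install Trax quietly; the leading "!" runs a shell command from a notebook cell.
# Pinning a version (e.g. trax==1.3.9) is recommended for reproducibility.
!pip install -q trax
```

The `-q` flag suppresses pip's verbose output so the notebook stays readable. Once installed, `import trax` makes the library's models and data pipelines available.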