Creating sequence-to-sequence models

Since every RNN unit we use also has an output, we can train RNNs on sequences to predict other sequences of variable length. In this recipe, we will take advantage of this fact to create an English-to-German translation model.

Getting ready
For this recipe, we will build a language translation model that translates English into German.

TensorFlow has a built-in model class for sequence-to-sequence training. We will illustrate how to train and use it on downloaded English-German sentence pairs. The data we will use comes from a zip file hosted at http://www.manythings.org/, which compiles sentence pairs from the Tatoeba Project (http://tatoeba.org/home). This data is a tab-delimited English...
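Before feeding the data to a model, we need to read the tab-delimited file into (English, German) sentence pairs. The sketch below shows one way to do this; the filename `deu.txt` and the `max_pairs` parameter are assumptions for illustration, not part of the downloaded archive's guaranteed layout.

```python
# Hypothetical helper: load tab-delimited sentence pairs, one pair per line:
#   English sentence<TAB>German sentence
def load_pairs(path, max_pairs=None):
    """Return a list of (english, german) tuples from a tab-delimited file."""
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            # Skip malformed lines that lack both columns
            if len(parts) >= 2:
                pairs.append((parts[0], parts[1]))
            if max_pairs is not None and len(pairs) >= max_pairs:
                break
    return pairs

# Example usage (assumes the extracted file is named deu.txt):
# pairs = load_pairs("deu.txt", max_pairs=10000)
```

Capping the number of pairs with `max_pairs` is handy for quick experiments, since the full corpus contains far more sentences than a first training run needs.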