Implementing an NMT from scratch – a German to English translator
Now we will implement an actual neural machine translator. We will build the NMT using raw TensorFlow operations and variables. The exercise is available in ch10/neural_machine_translation.ipynb. However, TensorFlow also provides a sublibrary known as the seq2seq library. You can read more about seq2seq, and learn to implement an NMT with it, in the Appendix, Mathematical Foundations and Advanced TensorFlow.
The reason we use raw TensorFlow is that, once you learn to implement a machine translator from scratch without any helper functions, you will be able to pick up the seq2seq library quickly. Furthermore, online resources for implementing sequence-to-sequence models with raw TensorFlow are very scarce, whereas there are numerous resources and tutorials on using the seq2seq library for machine translation.
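To make the idea of building the model from raw operations and variables concrete, here is a minimal sketch of a single LSTM step written with explicit tf.Variable weights and basic ops, rather than a prebuilt cell. The sizes (embedding_size, num_units) and variable names are illustrative assumptions, not the exact values used in the notebook, which defines its own hyperparameters and a full encoder-decoder on top of such a step.

```python
import tensorflow as tf

# Illustrative sizes only; the notebook defines its own hyperparameters.
embedding_size = 128
num_units = 256

# Raw variables for one LSTM cell: a single weight matrix and bias covering
# the input, forget, and output gates plus the candidate cell state.
W = tf.Variable(
    tf.random_normal([embedding_size + num_units, 4 * num_units], stddev=0.05),
    name='lstm_W')
b = tf.Variable(tf.zeros([4 * num_units]), name='lstm_b')

def lstm_step(x_t, h_prev, c_prev):
    # One LSTM time step expressed with raw TensorFlow ops.
    gates = tf.matmul(tf.concat([x_t, h_prev], axis=1), W) + b
    i, f, o, g = tf.split(gates, num_or_size_splits=4, axis=1)
    c = tf.sigmoid(f) * c_prev + tf.sigmoid(i) * tf.tanh(g)
    h = tf.sigmoid(o) * tf.tanh(c)
    return h, c
```

Unrolling such a step over the source sentence gives the encoder, and a second set of variables unrolled over the target sentence gives the decoder; the seq2seq library essentially wraps this kind of unrolling behind helper classes.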
Note
TensorFlow provides very informative sequence-to-sequence...