Sequence to sequence (seq2seq) is a particular kind of RNN architecture with successful applications in neural machine translation, text summarization, and speech recognition. In this recipe, we will discuss how to implement neural machine translation, achieving results similar to those of the Google Neural Machine Translation system (https://research.googleblog.com/2016/09/a-neural-network-for-machine.html). The key idea is to input a whole sequence of text, understand its entire meaning, and then output the translation as another sequence. Reading an entire sequence at once is very different from previous architectures, where a fixed set of words was translated from a source language into a destination language.
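In practice, this idea takes the form of an encoder-decoder pair: an encoder RNN reads the source sentence and compresses it into a fixed-size state vector, and a decoder RNN, initialized with that state, emits the target sentence one token at a time. The following is a minimal sketch of such a pair in Keras; the vocabulary sizes and layer dimensions are illustrative placeholders, not the configuration of the Google system.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Illustrative hyperparameters -- not taken from any reference system.
src_vocab, tgt_vocab = 8000, 8000
embed_dim, hidden_dim = 256, 512

# Encoder: reads the whole source sequence and summarizes it
# into a final hidden state (the "thought vector").
enc_inputs = layers.Input(shape=(None,), dtype="int32")
enc_embed = layers.Embedding(src_vocab, embed_dim)(enc_inputs)
_, state_h, state_c = layers.LSTM(hidden_dim, return_state=True)(enc_embed)

# Decoder: conditioned on the encoder's state, produces the target
# sequence one token at a time (teacher forcing during training).
dec_inputs = layers.Input(shape=(None,), dtype="int32")
dec_embed = layers.Embedding(tgt_vocab, embed_dim)(dec_inputs)
dec_out = layers.LSTM(hidden_dim, return_sequences=True)(
    dec_embed, initial_state=[state_h, state_c])
logits = layers.Dense(tgt_vocab)(dec_out)  # per-step scores over target vocab

model = Model([enc_inputs, dec_inputs], logits)
```

Because the decoder sees only the encoder's final state, the model translates whole sentences rather than a fixed window of words, which is exactly the departure from earlier architectures described above.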
This section is inspired by the 2016 PhD thesis, Neural Machine Translation, by Minh-Thang Luong (https://github...