Introducing seq2seq models
In Chapter 6, we outlined several types of recurrent models based on their input/output combinations. One of them is indirect many-to-many, or seq2seq, where an input sequence is transformed into a different output sequence, not necessarily of the same length as the input. One type of seq2seq task is machine translation. The input sequence is the words of a sentence in one language, and the output sequence is the words of the same sentence translated into another language. For example, we can translate the English phrase tourist attraction into the German Touristenattraktion. Not only does the output have a different length, but there is also no direct correspondence between the elements of the input and output sequences: here, one output element corresponds to a combination of two input elements.
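To make the idea concrete, the following is a minimal encoder-decoder sketch in PyTorch, not the book's implementation: the vocabulary sizes, dimensions, and token IDs are arbitrary placeholders. The encoder compresses the input sequence into a hidden state, and the decoder generates an output sequence of an independent length from that state.

```python
import torch
import torch.nn as nn

# Hypothetical sizes for illustration only
SRC_VOCAB, TGT_VOCAB, EMB, HID = 100, 120, 32, 64

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src):                 # src: (batch, src_len)
        _, hidden = self.rnn(self.embed(src))
        return hidden                        # (1, batch, HID): a summary of the input

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, tgt, hidden):          # tgt: (batch, tgt_len)
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden      # logits over the target vocabulary

# Input and output lengths are independent: a two-token source can map
# to a one-token target, as in "tourist attraction" -> "Touristenattraktion".
encoder, decoder = Encoder(), Decoder()
src = torch.tensor([[5, 17]])                # e.g., "tourist attraction" (placeholder IDs)
tgt = torch.tensor([[3]])                    # e.g., "Touristenattraktion" (placeholder ID)
logits, _ = decoder(tgt, encoder(src))
print(logits.shape)                          # torch.Size([1, 1, 120])
```

In practice, the decoder would generate tokens one at a time, feeding each prediction back as its next input until it emits an end-of-sequence token, which is how the output length can differ from the input length.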
Another type of indirect many-to-many task is the conversational chatbot, such as ChatGPT, where the initial input sequence is the first user query. After that,...