General conversational agents
Seq2seq models provide a natural starting point for learning multi-turn general conversations. A useful mental model is machine translation: just as a sentence can be translated into another language, the response to the previous utterance can be thought of as a translation of that input into a different language, the language of responses. More context can be encoded by passing in a sliding window of the previous conversation turns instead of just the last question or statement, as the sketch below illustrates. The term open-domain is often used to describe bots in this area because the domain of the conversation is not fixed; the bot should be able to discuss a wide variety of topics. Several issues in this space are research topics in their own right.
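To make the sliding-window idea concrete, here is a minimal sketch. The separator token, window size, and the seq2seq_generate placeholder are illustrative assumptions, not a specific library API; a real system would call its trained seq2seq decoder where the placeholder echoes.

```python
from collections import deque

SEP = " <sep> "      # assumed turn-separator token; a real vocabulary may differ
WINDOW_TURNS = 4     # how many previous turns to keep as context

history = deque(maxlen=WINDOW_TURNS)

def seq2seq_generate(encoder_input: str) -> str:
    """Placeholder for the trained seq2seq decoder; echoes for demonstration."""
    return "echo: " + encoder_input[-40:]

def respond(user_utterance: str) -> str:
    """Build the encoder input from a sliding window of turns, then decode."""
    history.append(user_utterance)
    encoder_input = SEP.join(history)
    reply = seq2seq_generate(encoder_input)
    history.append(reply)  # the bot's own reply becomes context for later turns
    return reply

print(respond("Hi there!"))
print(respond("What's your favorite movie?"))
```

Because the window is bounded, old turns eventually fall out of context; choosing the window size trades off coherence over long conversations against input length.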
One such problem is blandness, or a lack of personality: the generated dialog tends to be dry and generic. As one mitigation, we have seen in previous chapters how a temperature hyperparameter can adjust the predictability of a model's responses; a sketch of temperature sampling appears at the end of this section. Conversational agents...
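As a refresher, here is a minimal sketch of temperature sampling over next-token logits; the toy logits and the function name are illustrative, not from any particular library.

```python
import numpy as np

def sample_with_temperature(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Sample a token id from logits softened or sharpened by temperature.

    temperature < 1.0 sharpens the distribution (more predictable, blander);
    temperature > 1.0 flattens it (more diverse, but riskier).
    """
    scaled = logits / temperature
    scaled -= scaled.max()                        # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return int(np.random.choice(len(probs), p=probs))

logits = np.array([2.0, 1.0, 0.5, 0.1])           # toy next-token scores
print(sample_with_temperature(logits, temperature=0.7))  # safer, blander choice
print(sample_with_temperature(logits, temperature=1.5))  # more varied choice
```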