Summary
In this chapter, we applied the knowledge gained from our work on recurrent models and sequence-to-sequence models, combining them with an attention mechanism to construct a fully working chatbot. While conversing with our chatbot is unlikely to be indistinguishable from talking to a real human, with a considerably larger dataset we might hope to train an even more convincing one.
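As a reminder of the core idea, the attention mechanism lets the decoder weight each encoder state by its relevance to the current decoding step. The sketch below is a minimal, dependency-free illustration of dot-product attention, not the chapter's actual implementation; the function name and toy vectors are invented for this example.

```python
import math

def dot_product_attention(decoder_state, encoder_states):
    """Illustrative dot-product attention over a list of encoder states.

    Returns the attention weights and the resulting context vector.
    """
    # Alignment scores: dot product of the decoder state with each encoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    # Softmax normalises the scores into weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of the encoder states.
    context = [sum(w * enc[i] for w, enc in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return weights, context

# Toy example: the first encoder state aligns best with the decoder state,
# so it receives the largest attention weight.
weights, context = dot_product_attention(
    [1.0, 0.0],
    [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]],
)
```

In a real model the scores would be computed over learned hidden representations, and the context vector would be concatenated with the decoder state before predicting the next token.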
Although sequence-to-sequence models with attention were state of the art in 2017, machine learning is a rapidly progressing field, and multiple improvements have since been made to these models. In the final chapter, we will discuss some of these state-of-the-art models in more detail, as well as cover several other contemporary techniques used in machine learning for NLP, many of which are still in development.