Summary
In this chapter, we learned about attention mechanisms and how several architectures built on them now constitute the state of the art in NLP. We studied one specific model architecture for performing neural machine translation, and we briefly introduced other state-of-the-art architectures such as the Transformer and BERT.
Up to now, we have covered many different NLP models. In the next chapter, we will look at how a practical NLP project flows within an organization and at the technologies involved.