Summary
This chapter focused on another exciting field of natural language processing: text generation. In this context, we examined chatbots as a convenient case study. The content also included many references to previous chapters, encouraging you to revisit specific topics from a different perspective.
The power of the transformer architecture and the abundance of data have paved the way for more elaborate language models. We showed how to create such a model from scratch or fine-tune a pre-trained one. During this discussion, we also applied a third type of learning: reinforcement learning.
Evaluation metrics are a constant theme throughout this book; this chapter was no exception. We used perplexity as an evaluation metric and discussed TensorBoard, which helps us shed light on the internal mechanics of deep neural networks. Finally, we worked on creating user interfaces in Python.
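As a quick refresher on the metric mentioned above, perplexity is the exponential of the average negative log-likelihood a model assigns to each token in a sequence. The sketch below illustrates the calculation; the `perplexity` helper name and the toy log-probabilities are ours, for illustration only:

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp of the average negative log-likelihood
    over the tokens in a sequence (lower is better)."""
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Toy example: a model assigning probability 0.25 to each of 4 tokens
# behaves as if choosing uniformly among 4 options, so perplexity is 4.
logps = [math.log(0.25)] * 4
print(perplexity(logps))  # → 4.0
```

In practice, frameworks report perplexity directly from the cross-entropy loss, since the two differ only by the exponential.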
The next chapter is the final chapter of this book and deals with...