Overview of transformers and LLMs
Transformers and large language models (LLMs) are currently the best-performing technologies for natural language understanding (NLU). This does not mean that the approaches covered in earlier chapters are obsolete: depending on the requirements of a specific NLP project, a simpler approach may be more practical or cost-effective. This chapter gives you the information about these more recent approaches that you need in order to make that decision.
There is a great deal of information about the theoretical aspects of these techniques available on the internet, so here we will focus on applications, exploring how these technologies can be used to solve practical NLU problems.
As we saw in Chapter 10, recurrent neural networks (RNNs) have been a very effective approach in NLP because they do not assume that the elements of the input, specifically words, are independent of one another, and so are able to take into account sequences of input elements, such as the order of the words in a sentence.
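To make this order sensitivity concrete, here is a minimal sketch of a simple (Elman-style) RNN cell in NumPy. The weights are random placeholders rather than trained parameters, and the dimensions are arbitrary; the point is only that the hidden state at each step depends on the previous state, so feeding the same words in a different order produces a different final encoding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 4-dimensional "word" vectors, 3-dimensional hidden state.
d_in, d_h = 4, 3
W_xh = rng.normal(size=(d_h, d_in))   # input-to-hidden weights (untrained, for illustration)
W_hh = rng.normal(size=(d_h, d_h))    # hidden-to-hidden (recurrent) weights
b_h = np.zeros(d_h)

def rnn_encode(sequence):
    """Run a simple RNN over a sequence of input vectors and
    return the final hidden state as an encoding of the sequence."""
    h = np.zeros(d_h)
    for x in sequence:
        # The new state folds in both the current input and the previous state.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return h

# Two "sentences" containing the same word vectors in different orders.
words = [rng.normal(size=d_in) for _ in range(3)]
h_forward = rnn_encode(words)
h_reversed = rnn_encode(words[::-1])

# Because each step depends on the previous hidden state, word order
# changes the result: the two encodings differ.
print(np.allclose(h_forward, h_reversed))
```

Running this prints `False`, illustrating that, unlike a bag-of-words representation, the RNN's encoding is sensitive to the order of its inputs.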