Current developments
At the time of writing this book, the Technology Innovation Institute (https://www.tii.ae/) has just released its largest model, Falcon 180B. It is the largest openly available model and performs comparably to GPT-3.5. It illustrates the current direction of research in large language models.
Although GPT-4 exists and is believed to be substantially larger, we can develop very good software with moderately large models such as GPT-3.5. This brings us to some of the current topics that we, as a community, need to discuss. One of them is the energy sustainability of these models. Falcon 180B requires around 400 GB of memory to run inference, roughly eight Nvidia A100 80 GB GPUs (according to Hugging Face). We do not know how much hardware the GPT-4 model needs. The electricity and resources that such models consume must be justified by the value we get from them.
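The 400 GB figure follows from simple arithmetic: the weights alone occupy roughly the number of parameters multiplied by the bytes per parameter. The following Python sketch illustrates this back-of-the-envelope estimate; the 20% overhead factor for activations and runtime buffers is an assumption chosen for illustration, not a published figure.

```python
def estimate_inference_memory_gb(n_parameters: float,
                                 bytes_per_parameter: int = 2,
                                 overhead_factor: float = 1.2) -> float:
    """Approximate inference memory footprint in gigabytes.

    Assumes 2 bytes per parameter (bfloat16/float16 weights) and a
    flat overhead_factor for activations and buffers (an assumption,
    not a measured value).
    """
    return n_parameters * bytes_per_parameter * overhead_factor / 1e9


# Falcon 180B: ~180 billion parameters in bfloat16
print(f"Falcon 180B: ~{estimate_inference_memory_gb(180e9):.0f} GB")

# GPT-3.5: size not officially published; 175 billion parameters is
# the figure reported for its GPT-3 predecessor
print(f"GPT-3.5:     ~{estimate_inference_memory_gb(175e9):.0f} GB")
```

For Falcon 180B, this yields roughly 430 GB, which is consistent with the 400 GB reported by Hugging Face and explains why a single GPU is far from sufficient for inference with models of this size.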
We are also approaching the limits of conventional computational power when it comes...