Cloud-based LLMs
Recently, a number of cloud-based pretrained large language models have appeared that show very impressive performance. These newer models are based on the same principles as BERT, but they have been trained on much larger amounts of data, which accounts for much of that performance. Unlike BERT, they are too large to be downloaded and used locally; in addition, some are closed and proprietary and cannot be downloaded for that reason. Because they cannot be downloaded, it is important to keep in mind that they are not appropriate for every application. Specifically, if there are any privacy or security concerns regarding the data, it may not be a good idea to send it to the cloud for processing. Some of these systems are GPT-2, GPT-3, GPT-4, ChatGPT, and OPT-175B, and new LLMs are being published frequently.
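To make the privacy concern concrete, the sketch below shows how a request to a cloud-hosted LLM is typically assembled before it is sent over HTTP. The endpoint URL, model name, and environment-variable name are placeholders modeled loosely on common chat-completion APIs, not any specific vendor's documented interface; the key point is that the prompt text leaves the local machine in the request body.

```python
import json
import os

# Placeholder endpoint and model name -- these are illustrative,
# not a real vendor's API.
API_URL = "https://api.example.com/v1/chat/completions"


def build_request(prompt, model="example-model"):
    """Assemble the headers and JSON body for a cloud LLM call.

    Note that the prompt -- and any sensitive data it contains --
    is placed verbatim in the request body, which is why cloud
    LLMs may be unsuitable when privacy or security is a concern.
    """
    headers = {
        # The API key is read from the environment (placeholder name).
        "Authorization": f"Bearer {os.environ.get('LLM_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body


headers, body = build_request("Summarize this patient record: ...")
# Inspecting the body shows the raw text that would be transmitted.
print(json.loads(body)["messages"][0]["content"])
```

At this point the request would be sent with an HTTP client; inspecting the serialized body before sending is a simple way to audit exactly what data is leaving the local environment.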