Identifying challenges with LLM solutions
Despite their impressive capabilities, LLMs face challenges when solving complex real-world problems. In this section, we will explore some of the challenges faced by LLM solutions and discuss possible ways to tackle them, grouped at a high level as follows:
- Output and input limitations:
- LLMs just produce text: Text output alone provides value for many businesses. However, many other use cases require predictions and recommendations in entirely different formats.
- The context size of an LLM is limited: The issue is that compute requirements for training and inference grow rapidly with input size (quadratically, for standard self-attention), so context sizes usually stay in a range of one to three thousand tokens. This limitation matters mainly for use cases that require long context; a few thousand tokens should be enough for most use cases.
- An LLM is a text-specific model: Other data modalities are not supported by default...
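Since LLMs emit plain text, one common workaround for the output-format limitation above is to prompt the model for JSON and parse its reply into the structure the business use case needs. A minimal sketch, assuming a hypothetical prompt that asks for a JSON array of `{"item", "score"}` recommendation objects (the schema and fallback behavior are illustrative, not any specific library's API):

```python
import json


def parse_recommendations(raw_text: str) -> list[dict]:
    """Parse a model's text reply into structured recommendations.

    Assumes the model was prompted to answer with a JSON array of
    {"item": str, "score": float} objects; falls back to an empty
    list when the reply is not valid JSON.
    """
    try:
        data = json.loads(raw_text)
    except json.JSONDecodeError:
        return []
    if not isinstance(data, list):
        return []
    # Keep only well-formed entries; models sometimes add stray fields
    # or malformed items, so validate each one defensively.
    return [
        rec for rec in data
        if isinstance(rec, dict) and "item" in rec and "score" in rec
    ]


# A well-behaved model reply parses cleanly:
reply = '[{"item": "sku-42", "score": 0.91}, {"item": "sku-7", "score": 0.55}]'
print(parse_recommendations(reply))
# A free-text reply degrades gracefully to an empty list:
print(parse_recommendations("Sorry, I cannot help with that."))
```

In practice, production systems often add a retry loop that re-prompts the model when parsing fails, rather than silently returning an empty result.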
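The context-size limit above usually has to be handled on the application side by capping the prompt before it reaches the model. A minimal sketch using whitespace splitting as a crude stand-in for real tokenization (actual tokenizers such as BPE produce different counts, so the threshold here is only illustrative):

```python
def truncate_to_context(text: str, max_tokens: int = 2000) -> str:
    """Crudely cap prompt length to fit a model's context window.

    Whitespace splitting is only a rough proxy for real tokenization,
    but it illustrates keeping input within a few-thousand-token range.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text
    # Keep the most recent tokens, which usually matter most in chat-style
    # prompts; other strategies (summarizing, keeping the head) also exist.
    return " ".join(tokens[-max_tokens:])


print(truncate_to_context("a b c d e", max_tokens=3))  # keeps the last 3 words
```

For production use, a real tokenizer for the target model should replace the whitespace split, since token counts, not word counts, are what the context limit is measured in.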