Creating a chatbot using an LLM
In this recipe, we will create a chatbot using the LangChain framework. In the previous recipe, we learned how to ask an LLM questions about a piece of content. Although the LLM answered those questions accurately, the interaction was completely stateless: the LLM looked at each question in isolation and ignored any previous questions or answers. In this recipe, we will use an LLM to create a chat interaction in which the LLM is aware of the earlier turns of the conversation and uses their context to answer subsequent questions. One application of such a framework is conversing with document sources and arriving at the right answer by asking a series of questions. These document sources can be of a wide variety of types, from internal company knowledge bases to customer contact center troubleshooting guides. Our goal here is to present a basic step-by-step framework to demonstrate the essential components...
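To make the idea of a stateful chat concrete before we walk through the steps, here is a minimal sketch of a memory-aware conversation with LangChain. It assumes the `langchain` and `langchain-openai` packages are installed, that an `OPENAI_API_KEY` is available in the environment, and that a chat-capable model such as `gpt-3.5-turbo` is used; the exact import paths and memory classes may differ across LangChain versions, so treat this as an illustration rather than the recipe's final code.

```python
# Minimal sketch: a chat loop that remembers earlier turns.
# Assumptions: langchain + langchain-openai installed, OPENAI_API_KEY set,
# model name is illustrative and can be swapped for any supported chat LLM.
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

# ConversationBufferMemory keeps every prior exchange and injects it into
# the prompt, so each new question is answered with the full conversation
# as context instead of in isolation.
chat = ConversationChain(llm=llm, memory=ConversationBufferMemory())

print(chat.predict(input="My knowledge base covers router troubleshooting."))
print(chat.predict(input="What did I just say my knowledge base covers?"))
```

The second question only makes sense with the first one in context; because the memory object replays the earlier exchange, the model can answer it correctly, which is exactly the behavior the stateless setup from the previous recipe could not provide.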