Building Conversational Applications
This chapter begins the hands-on part of the book with our first concrete implementation of an LLM-powered application. Building on the knowledge you have gained in the previous chapters, we will walk step by step through the implementation of a conversational application using LangChain and its components. By the end of this chapter, you will be able to set up your own conversational application project with just a few lines of code, as sketched after the topic list below.
We will cover the following key topics:
- Configuring the schema of a simple chatbot
- Adding the memory component
- Adding non-parametric knowledge
- Adding tools and making the chatbot “agentic”
- Developing the front-end with Streamlit
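
To give a feel for the "few lines of code" claim before we dive in, here is a minimal sketch of a single-turn chat call with LangChain. It is not the chapter's exact code: it assumes the `langchain-openai` and `langchain-core` packages are installed, an `OPENAI_API_KEY` environment variable is set, and that the `gpt-3.5-turbo` model is used purely as an example; the chapter will extend this skeleton with memory, retrieval, tools, and a Streamlit front-end.

```python
# Minimal sketch of a single-turn chatbot call with LangChain.
# Assumes: `pip install langchain-openai langchain-core` and OPENAI_API_KEY set.
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage

# Chat model wrapper; model name is an illustrative choice, not prescribed here.
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

# A simple schema: a system message defining the bot's role, plus the user's turn.
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Hi! Can you recommend a good book about large language models?"),
]

# Invoke the model and print the assistant's reply.
response = llm.invoke(messages)
print(response.content)
```

Each of the chapter's topics builds on this skeleton: memory keeps the message list across turns, non-parametric knowledge injects retrieved context, tools turn the chain into an agent, and Streamlit wraps the whole thing in a web UI.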