Getting Started with LangChain
In this book, we’ll write a lot of code and test many different integrations and tools, so in this chapter we’ll give basic setup instructions for all the libraries you’ll need, using the most common environment and dependency management tools: Docker, Conda, pip, and Poetry. This will ensure that you can run all the practical examples in this book.
Next, we’ll go through the model integrations we can use, such as OpenAI’s ChatGPT, models on Hugging Face, Jina AI, and others. We’ll introduce, set up, and work with a few of these providers in turn, and for each of them, we’ll show how to obtain an API key.
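To give a rough idea of what this looks like in practice before we go through each provider, here is a minimal sketch of calling an OpenAI chat model through LangChain. The package names, the OPENAI_API_KEY environment variable, the model name, and the import path are assumptions that depend on your LangChain version; newer releases move ChatOpenAI into the separate langchain-openai package and use invoke() rather than predict().

```python
# Minimal sketch, assuming LangChain and the OpenAI client are installed,
# for example with: pip install langchain openai
import os

from langchain.chat_models import ChatOpenAI

# Providers authenticate via an API key, usually read from an environment
# variable; never hard-code real keys in source files.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder for your own key

# Create a chat model wrapper and send it a single prompt.
chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
print(chat.predict("Say hello to the readers of this chapter!"))
```

The same pattern, an API key in the environment plus a thin model wrapper, applies to the other providers we'll cover; only the key name and the wrapper class change.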
Finally, as a practical example, we’ll go through a real-world application: an LLM app that could help customer service agents, one of the main areas where LLMs could prove to be game-changing. This will give us a bit more context around using LangChain, and we can introduce tips and tricks...