Generative AI with LangChain

Generative AI with LangChain: Build large language model (LLM) apps with Python, ChatGPT, and other LLMs

Ben Auffarth
4 out of 5 (34 ratings)
Paperback | Dec 2023 | 368 pages | 1st Edition
eBook: NZ$40.99 (NZ$58.99)
Paperback: NZ$73.99
Subscription: Free Trial

What do you get with a Packt Subscription?

Free for the first 7 days. $19.99 p/m after that. Cancel any time!

  • Unlimited ad-free access to the largest independent learning library in tech. Access this title and thousands more!
  • 50+ new titles added per month, including many first-to-market concepts and exclusive early access to books as they are being written.
  • Innovative learning tools, including AI book assistants, code context explainers, and text-to-speech.
  • Thousands of reference materials covering every tech concept you need to stay up to date.

Generative AI with LangChain

Questions

If you've read and understood this chapter, you should be able to answer these questions:

  1. What is a generative model?
  2. Which applications exist for generative models?
  3. What's a large language model (LLM) and what does it do?
  4. How can we get better performance from LLMs?
  5. What are the conditions that make these models possible?
  6. Which companies and organizations are the big players in developing LLMs?
  7. What is a transformer and what does it consist of?
  8. What does GPT mean?
  9. How does Stable Diffusion work?
  10. How is Stable Diffusion trained?

If you struggle to answer these questions, please refer back to the corresponding sections in this chapter to make sure you've understood the material.

LangChain for LLM Apps

Large Language Models (LLMs) like GPT-4 have demonstrated immense capabilities in generating human-like text. However, simply accessing LLMs via APIs has its limits; combining them with other data sources and tools enables far more powerful applications. In this chapter, we will introduce LangChain as a way to overcome LLM limitations and build innovative language-based applications. We aim to demonstrate the potential of combining recent AI advancements with a robust framework like LangChain.

We will start by outlining some challenges faced when using LLMs on their own, like the lack of external knowledge, incorrect reasoning, and the inability to take action. LangChain provides solutions to these issues through different integrations and off-the-shelf components for specific tasks. We will walk through examples of how developers can use LangChain’s capabilities to create customized natural language processing solutions, outlining the components...

Going beyond stochastic parrots

LLMs have gained significant attention and popularity due to their ability to generate human-like text and understand natural language, which makes them useful in scenarios that revolve around content generation, text classification, and summarization. However, their apparent fluency obscures serious deficiencies that constrain real-world utility. The concept of stochastic parrots helps to elucidate this fundamental issue.

Stochastic parrots refers to LLMs that can produce convincing language but lack any true comprehension of the meaning behind words. Coined by researchers Emily Bender, Timnit Gebru, Margaret Mitchell, and Angelina McMillan-Major in their influential paper On the Dangers of Stochastic Parrots (2021), the term critiques models that mindlessly mimic linguistic patterns. Without being grounded in the real world, models can produce responses that are inaccurate, irrelevant, unethical, or make little logical sense.

Simply scaling...

What is LangChain?

Created in 2022 by Harrison Chase, LangChain is an open-source Python framework for building LLM-powered applications. It provides developers with modular, easy-to-use components for connecting language models with external data sources and services. The project has attracted millions in venture capital funding from firms such as Sequoia Capital and Benchmark, which have also backed Apple, Cisco, Google, WeWork, Dropbox, and many other successful companies.

LangChain simplifies the development of sophisticated LLM applications by providing reusable components and pre-assembled chains. Its modular architecture abstracts access to LLMs and external services into a unified interface. Developers can combine these building blocks to carry out complex workflows.
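
To make this unified interface concrete, here is a minimal sketch (not taken from the book) showing the same prompt running against two different chat-model providers. It assumes a LangChain ~0.1-style installation with the langchain-core, langchain-openai, and langchain-anthropic packages and the matching API keys in the environment; import paths and model names have shifted between releases, so treat them as illustrative.

```python
# Hedged sketch: one prompt, two interchangeable model backends.
# Assumes OPENAI_API_KEY and ANTHROPIC_API_KEY are set in the environment.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

prompt = ChatPromptTemplate.from_template("Explain {topic} in one sentence.")

for llm in (ChatOpenAI(temperature=0), ChatAnthropic(model="claude-3-haiku-20240307")):
    # Every chat model implements the same Runnable interface, so the
    # composition and the .invoke() call are identical across providers.
    chain = prompt | llm
    print(chain.invoke({"topic": "stochastic parrots"}).content)
```

Because both models satisfy the same interface, swapping providers is a one-line change rather than a rewrite of the application.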

Building impactful LLM apps involves challenges like prompt engineering, bias mitigation, productionizing, and integrating external data. LangChain reduces this learning curve through its abstractions...

Exploring key components of LangChain

Chains, agents, memory, and tools enable the creation of sophisticated LLM applications that go beyond basic API calls to a single LLM. In the following dedicated subsections on these key concepts, we’ll consider how they enable the development of capable systems by combining language models with external data and services.

We won’t dive into implementation patterns in this chapter; however, we will discuss in more detail what some of these components are good for. By the end, you should have the level of understanding that’s required to architect systems with LangChain. Let’s start with chains!

What are chains?

Chains are a critical concept in LangChain for composing modular components into reusable pipelines. For example, developers can put together multiple LLM calls and other components in a sequence to create complex applications for things like chatbot-like social interactions, data extraction, and...
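
As a brief illustration (not the book's own code), the following hedged sketch chains two LLM calls so that the parsed output of the first step feeds the prompt of the second — the essence of a pipeline that extracts data and then generates a response. It assumes a LangChain ~0.1-style API with langchain-core and langchain-openai installed and OPENAI_API_KEY set; the ticket text and prompts are placeholders.

```python
# Hedged sketch of a two-step chain built with the LangChain
# Expression Language (LCEL).
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(temperature=0)
parser = StrOutputParser()

# Step 1: extract the key facts from a support ticket.
extract = (
    ChatPromptTemplate.from_template("List the key facts in this ticket:\n\n{ticket}")
    | llm
    | parser
)

# Step 2: draft a reply based on those facts.
reply = (
    ChatPromptTemplate.from_template("Write a short, polite reply addressing:\n\n{facts}")
    | llm
    | parser
)

# Compose the steps: step 1's output becomes the 'facts' input of step 2.
pipeline = {"facts": extract} | reply
print(pipeline.invoke({"ticket": "My order #123 arrived damaged and support never replied."}))
```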

How does LangChain work?

The LangChain framework simplifies building sophisticated LLM applications by providing modular components that facilitate connecting language models with other data and services. The framework organizes capabilities into modules spanning from basic LLM interaction to complex reasoning and persistence.

These components can be combined into pipelines, also called chains, that sequence the following actions (a minimal code sketch follows the list):

  • Loading documents
  • Embedding for retrieval
  • Querying LLMs
  • Parsing outputs
  • Writing memory
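
A hedged sketch of such a pipeline, covering loading, splitting, embedding, retrieval, querying, and parsed output (memory is omitted for brevity), might look as follows. It assumes a LangChain ~0.1-style installation with the langchain, langchain-community, langchain-openai, and faiss-cpu packages, an OPENAI_API_KEY in the environment, and a placeholder notes.txt file; class locations have moved between versions.

```python
# Hedged sketch: load -> split -> embed -> retrieve -> query -> parse.
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain.chains import RetrievalQA

docs = TextLoader("notes.txt").load()                       # loading documents
chunks = RecursiveCharacterTextSplitter(chunk_size=500).split_documents(docs)
store = FAISS.from_documents(chunks, OpenAIEmbeddings())    # embedding for retrieval
qa = RetrievalQA.from_chain_type(                           # querying the LLM over retrieved context
    llm=ChatOpenAI(temperature=0),
    retriever=store.as_retriever(),
)
result = qa.invoke({"query": "What do the notes say about deadlines?"})
print(result["result"])                                     # parsed answer string
```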

Chains match modules to application goals, while agents leverage chains for goal-directed interactions with users. They repeatedly execute actions based on observations, plan optimal logic chains, and persist memory across conversations.
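
As a hedged illustration of that loop (again, not the book's own code), the classic initialize_agent helper wires a chat model to a tool and lets it alternate between reasoning steps and tool calls until it can answer. It assumes LangChain ~0.1 with langchain-community, langchain-openai, and the duckduckgo-search package installed; newer releases favor dedicated agent constructors, so consider this a sketch.

```python
# Hedged sketch of a ReAct-style agent with one search tool.
from langchain_openai import ChatOpenAI
from langchain_community.tools import DuckDuckGoSearchRun
from langchain.agents import initialize_agent, AgentType

agent = initialize_agent(
    tools=[DuckDuckGoSearchRun()],              # the agent's observation source
    llm=ChatOpenAI(temperature=0),
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,                               # print the thought/action/observation loop
)
agent.invoke({"input": "Who created LangChain, and in which year was it released?"})
```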

The modules, ranging from simple to advanced, are:

  • LLMs and chat models: Provide interfaces to connect and query language models like GPT-3. Support async, streaming, and batch...

Comparing LangChain with other frameworks

LLM application frameworks provide specialized tooling that harnesses the power of LLMs to solve complex problems. Several libraries have emerged that combine generative AI models with other tools to build LLM applications.

Several open-source frameworks exist for building dynamic LLM applications, and each offers value for developing cutting-edge LLM applications. This graph shows their popularity over time (data source: GitHub star history, https://star-history.com/):

Figure 2.11: Comparison of popularity between different frameworks in Python

We can see the number of stars on GitHub over time for each project. Haystack is the oldest of the compared frameworks, having started in early 2020 (as per the earliest GitHub commits). It is also the least popular in terms of stars on GitHub. LangChain, LlamaIndex (previously called GPTIndex), and SuperAGI...

Summary

LLMs produce convincing language but have significant limitations in terms of reasoning, knowledge, and access to tools. The LangChain framework simplifies the building of sophisticated applications powered by LLMs that can mitigate these shortcomings. It provides developers with modular, reusable building blocks such as chains for composing pipelines and agents for goal-oriented interactions. These building blocks fit together into LLM apps with extended capabilities.

As we saw in this chapter, chains allow sequencing calls to LLMs, databases, APIs, and more to accomplish multi-step workflows. Agents leverage chains to take actions based on observations for managing dynamic applications. Memory persists information across executions to maintain state. Together, these concepts enable developers to overcome the limitations of individual LLMs by integrating external data, actions, and context. In other words, LangChain reduces complex orchestration into customizable...

Questions

Please see if you can come up with answers to these questions. I’d recommend you go back to the corresponding sections of this chapter if you are unsure about any of them:

  1. What are the limitations of LLMs?
  2. What are stochastic parrots?
  3. What are LLM applications?
  4. What is LangChain and why should you use it?
  5. What are LangChain’s key features?
  6. What is a chain in LangChain?
  7. What is an agent?
  8. What is memory and why do we need it?
  9. What kind of tools are available in LangChain?
  10. How does LangChain work?

Join our community on Discord

Join our community’s Discord space for discussions with the authors and other readers:

https://packt.link/lang


Key benefits

  • Learn how to leverage LangChain to work around LLMs’ inherent weaknesses
  • Delve into LLMs with LangChain and explore their fundamentals, ethical dimensions, and application challenges
  • Get better at using ChatGPT and GPT models, from heuristics and training to scalable deployment, empowering you to transform ideas into reality

Description

ChatGPT and the GPT models by OpenAI have brought about a revolution not only in how we write and research but also in how we can process information. This book discusses the functioning, capabilities, and limitations of LLMs underlying chat systems, including ChatGPT and Gemini. It demonstrates, in a series of practical examples, how to use the LangChain framework to build production-ready and responsive LLM applications for tasks ranging from customer support to software development assistance and data analysis – illustrating the expansive utility of LLMs in real-world applications. Unlock the full potential of LLMs within your projects as you navigate through guidance on fine-tuning, prompt engineering, and best practices for deployment and monitoring in production environments. Whether you're building creative writing tools, developing sophisticated chatbots, or crafting cutting-edge software development aids, this book will be your roadmap to mastering the transformative power of generative AI with confidence and creativity.

Who is this book for?

The book is for developers, researchers, and anyone interested in learning more about LangChain. Whether you are a beginner or an experienced developer, this book will serve as a valuable resource if you want to get the most out of LLMs using LangChain. Basic knowledge of Python is a prerequisite, while prior exposure to machine learning will help you follow along more easily.

What you will learn

  • Create LLM apps with LangChain, like question-answering systems and chatbots
  • Understand transformer models and attention mechanisms
  • Automate data analysis and visualization using pandas and Python
  • Grasp prompt engineering to improve performance
  • Fine-tune LLMs and get to know the tools to unleash their power
  • Deploy LLMs as a service with LangChain and apply evaluation strategies
  • Privately interact with documents using open-source LLMs to prevent data leaks

Product Details

Publication date : Dec 22, 2023
Length: 368 pages
Edition : 1st
Language : English
ISBN-13 : 9781835083468


Packt Subscriptions

See our plans and pricing
$19.99 billed monthly
  • Unlimited access to Packt's library of 7,000+ practical books and videos
  • Constantly refreshed with 50+ new titles a month
  • Exclusive early access to books as they're written
  • Solve problems while you work with advanced search and reference features
  • Offline reading on the mobile app
  • Simple pricing, no contract

$199.99 billed annually
  • Unlimited access to Packt's library of 7,000+ practical books and videos
  • Constantly refreshed with 50+ new titles a month
  • Exclusive early access to books as they're written
  • Solve problems while you work with advanced search and reference features
  • Offline reading on the mobile app
  • Choose a DRM-free eBook or video every month to keep
  • PLUS own as many other DRM-free eBooks or videos as you like for just NZ$7 each
  • Exclusive print discounts

$279.99 billed in 18 months
  • Unlimited access to Packt's library of 7,000+ practical books and videos
  • Constantly refreshed with 50+ new titles a month
  • Exclusive early access to books as they're written
  • Solve problems while you work with advanced search and reference features
  • Offline reading on the mobile app
  • Choose a DRM-free eBook or video every month to keep
  • PLUS own as many other DRM-free eBooks or videos as you like for just NZ$7 each
  • Exclusive print discounts

Frequently bought together


Machine Learning with PyTorch and Scikit-Learn: NZ$80.99
Modern Generative AI with ChatGPT and OpenAI Models: NZ$73.99
Generative AI with LangChain: NZ$73.99
Total: NZ$228.97

Table of Contents

12 Chapters
  1. What Is Generative AI?
  2. LangChain for LLM Apps
  3. Getting Started with LangChain
  4. Building Capable Assistants
  5. Building a Chatbot Like ChatGPT
  6. Developing Software with Generative AI
  7. LLMs for Data Science
  8. Customizing LLMs and Their Output
  9. Generative AI in Production
  10. The Future of Generative Models
  11. Other Books You May Enjoy
  12. Index

Customer reviews

Top Reviews
Rating distribution
4 out of 5
(34 Ratings)
5 star 55.9%
4 star 17.6%
3 star 2.9%
2 star 14.7%
1 star 8.8%

Josep Oriol Oct 13, 2023
5 out of 5
This book is as up to date as it can be, with lots of helpful code examples, covering all aspects of the LLM development pipeline. It's my work companion, way more useful than the official LangChain documentation. A must for everyone involved in LLMOps.
Subscriber review (Packt)
Kam F Siu Jan 30, 2024
5 out of 5
Verified review (Feefo)
Andrew McVeigh May 01, 2024
5 out of 5
I'm only 100 pages into this book, but boy is it well phrased and extremely clear. I write apps around LLMs, including RAG architectures. Perhaps it's just the current state of my learning, but I've found this book to be extremely helpful and very logically organized. I'll revisit this review once I'm through the entire book, but so far 10/10. It's easily the best and most self-contained book I have on the subject.
Verified review (Amazon)
F. P. Dec 22, 2023
5 out of 5
During my learning journey into large language model (LLM) development, I encountered several challenges:

  • The difficulty of providing precise instructions within specific contexts, which I found to be the most challenging and crucial aspect.
  • Switching between different LLM models with minimal programming effort.
  • Selectively saving chat history in memory.
  • Handling data efficiently, including managing input data of various modalities and making output data accessible.

In overcoming these obstacles, I came across LangChain, a robust toolkit designed for LLM application development. The book "Generative AI with LangChain" by Ben Auffarth provides a comprehensive overview, covering the basics of LLMs, LangChain, and its key components (chains, agents, memory, tools). The book also explores sample applications such as chatbots, customization of LLM models (conditioning, fine-tuning), and the deployment of LLM apps into production. Unlike theoretical research materials, this book serves as a practical, one-stop resource for understanding the current landscape of LLM applications.

Some of the interesting points:

  • LangChain helps standardize prompts by providing prompt templates (LangChain Expression Language).
  • LangChain provides extensive integrations to other model APIs, including Fake LLM, OpenAI, Hugging Face, GCP, Jina AI, Replicate, etc.
  • LangChain has "memory", which allows the model to be context-aware.
  • LangChain supports advanced data facilities such as a map-reduce approach and output parsers.

This book has significantly saved me time, providing consolidated information without the need for extensive online searches or inquiries to ChatGPT. For those unsure about its content, I recommend checking out the free sample on Amazon – it's undoubtedly worth every penny.
Verified review (Amazon)
hawkinflight Jan 05, 2024
5 out of 5
I have not used LangChain before, and I am looking at this book to learn how to create an LLM app. I am really looking forward to trying it out for all three types of apps covered in the book: assistants/chatbots, code generation, and data science. The book is clear and straight to the point, so I expect to be able to try these out fairly quickly. I have gotten through the "setting up the dependencies" section. I cloned the book's GitHub repo, and I tried three methods, for variety's sake, to create a Python environment: pip, conda, and Docker, all on Windows, and I believe I have them all set up. I hit some bumps, but I was able to follow the onscreen error messages and get past them. For pip, I needed to install MSFT Build Tools to get C++. For the conda case, I had to modify the YAML file for two of the packages, ncurses and readline, which have different names for Windows. In Chapter 2 there is a comparison of LangChain with other frameworks, from which you get a feel that choosing LangChain at this moment is the best choice. I am happy to have found this book, and I can't wait to proceed with the next steps. It's a lot of fun to be able to interact with LLMs.
Verified review (Amazon)

FAQs

What is included in a Packt subscription?

A subscription provides you with full access to view all Packt and licensed content online, including exclusive access to Early Access titles. Depending on the tier chosen, you can also earn credits and discounts to use towards owning content.

How can I cancel my subscription?

To cancel your subscription, simply go to the account page (found in the top right of the page or at https://subscription.packtpub.com/my-account/subscription). From there you will see the 'cancel subscription' button in the grey box containing your subscription information.

What are credits?

Credits can be earned by reading 40 sections of any title within the payment cycle, a month starting from the day of subscription payment. You also earn a credit every month if you subscribe to our annual or 18-month plans. Credits can be used to buy books DRM-free, the same way that you would pay for a book. Your credits can be found on the subscription homepage (subscription.packtpub.com) by clicking the 'My Library' dropdown and selecting 'Credits'.

What happens if an Early Access Course is cancelled?

Projects are rarely cancelled, but sometimes it's unavoidable. If an Early Access course is cancelled or excessively delayed, you can exchange your purchase for another course. For further details, please contact us here.

Where can I send feedback about an Early Access title?

If you have any feedback about the product you're reading, or Early Access in general, then please fill out a contact form here and we'll make sure the feedback gets to the right team. 

Can I download the code files for Early Access titles?

We try to ensure that all books in Early Access have code available to use, download, and fork on GitHub. This helps us be more agile in the development of the book, and helps keep the often changing code base of new versions and new technologies as up to date as possible. Unfortunately, however, there will be rare cases when it is not possible for us to have downloadable code samples available until publication.

When we publish the book, the code files will also be available to download from the Packt website.

How accurate is the publication date?

The publication date is as accurate as we can be at any point in the project. Unfortunately, delays can happen. Often those delays are out of our control, such as changes to the technology code base or delays in the tech release. We do our best to give you an accurate estimate of the publication date at any given time, and as more chapters are delivered, the more accurate the delivery date will become.

How will I know when new chapters are ready?

We'll let you know every time there has been an update to a course that you've bought in Early Access. You'll get an email to let you know there has been a new chapter, or a change to a previous chapter. The new chapters are automatically added to your account, so you can also check back there any time you're ready and download or read them online.

I am a Packt subscriber, do I get Early Access?

Yes, all Early Access content is fully available through your subscription. You will need a paid subscription or an active trial in order to access all titles.

How is Early Access delivered?

Early Access is currently only available as a PDF or through our online reader. As we make changes or add new chapters, the files in your Packt account will be updated so you can download them again or view them online immediately.

How do I buy Early Access content?

Early Access is a way of us getting our content to you quicker, but the method of buying the Early Access course is still the same. Just find the course you want to buy, go through the check-out steps, and you’ll get a confirmation email from us with information and a link to the relevant Early Access courses.

What is Early Access?

Keeping up to date with the latest technology is difficult; new versions, new frameworks, new techniques. This feature gives you a head start on our content as it's being created. With Early Access you'll receive each chapter as it's written, and get regular updates throughout the product's development, as well as the final course as soon as it's ready.

We created Early Access as a means of giving you the information you need, as soon as it's available. As we go through the process of developing a course, 99% of it can be ready but we can't publish until that last 1% falls into place. Early Access helps to unlock the potential of our content early, to help you start your learning when you need it most. You not only get access to every chapter as it's delivered, edited, and updated, but you'll also get the finalized, DRM-free product to download in any format you want when it's published. As a member of Packt, you'll also be eligible for our exclusive offers, including a free course every day, and discounts on new and popular titles.