Building AI Intensive Python Applications

You're reading from Building AI Intensive Python Applications: Create intelligent apps with LLMs and vector databases

Product type: Paperback
Published in: Sep 2024
Publisher: Packt
ISBN-13: 9781836207252
Length: 298 pages
Edition: 1st
Table of Contents (18 chapters)

Preface
1. Chapter 1: Getting Started with Generative AI (Free Chapter)
2. Chapter 2: Building Blocks of Intelligent Applications
3. Part 1: Foundations of AI: LLMs, Embedding Models, Vector Databases, and Application Design
4. Chapter 3: Large Language Models
5. Chapter 4: Embedding Models
6. Chapter 5: Vector Databases
7. Chapter 6: AI/ML Application Design
8. Part 2: Building Your Python Application: Frameworks, Libraries, APIs, and Vector Search
9. Chapter 7: Useful Frameworks, Libraries, and APIs
10. Chapter 8: Implementing Vector Search in AI Applications
11. Part 3: Optimizing AI Applications: Scaling, Fine-Tuning, Troubleshooting, Monitoring, and Analytics
12. Chapter 9: LLM Output Evaluation
13. Chapter 10: Refining the Semantic Data Model to Improve Accuracy
14. Chapter 11: Common Failures of Generative AI
15. Chapter 12: Correcting and Optimizing Your Generative AI Application
16. Other Books You May Enjoy
Appendix: Further Reading
Index

The generative AI stack

A stack is the combination of tools, libraries, software, and services used together to build and run an application. The GenAI stack includes programming languages, LLM providers, frameworks, databases, and deployment solutions. Though the GenAI stack is relatively new, it already offers engineers many variations and options to choose from.

Let’s discuss what you need to build a functional GenAI application. The bare minimum requirements are the following, as also shown in Figure 1.2:

  • An operating system: Usually, this is Unix/Linux based.
  • A storage layer: An SQL or NoSQL database. This book uses MongoDB.
  • A vector database capable of storing embeddings: This book uses MongoDB, which stores embeddings alongside your data rather than in a separate database.
  • A web server: Apache and Nginx are quite popular.
  • A development environment: This could be Node.js/JavaScript, .NET, Java, or Python. This book uses Python throughout the examples, with a bit of JavaScript where needed.
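To make the layering concrete, here is a toy sketch in pure Python of how these pieces fit together. Every name in it is an illustrative stand-in: the `fake_embed` function stands in for an embedding model, the in-memory `database` list stands in for the storage layer (MongoDB in this book), and `handle_request` stands in for the web/application layer.

```python
import json

def fake_embed(text: str) -> list[float]:
    # Stand-in for an embedding model; a real app would call one here.
    return [round(sum(ord(c) for c in text) % 97 / 97.0, 3)]

database: list[dict] = []  # stands in for the storage layer (MongoDB)

def store_document(text: str) -> None:
    # Store the text together with its embedding, as MongoDB would.
    database.append({"text": text, "embedding": fake_embed(text)})

def handle_request(index: int) -> str:
    # Stands in for the web server layer: return a stored document as JSON.
    return json.dumps(database[index])

store_document("GenAI applications need a storage layer")
print(handle_request(0))
```

The point of the sketch is only the flow of data through the layers; each stand-in would be replaced by the real component from Figure 1.2.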

Figure 1.2: A basic GenAI stack

If you want to learn more about the AI stack, you can find detailed information at www.mongodb.com/resources/basics/ai-stack.

Python and GenAI

Python was conceived in the late 1980s by Guido van Rossum and officially released in 1991. Over the decades, Python has evolved into a versatile language, beloved by developers for its robust functionality and its clean, easy-to-understand syntax, which also makes it an ideal choice for beginner developers.

Although it is not entirely clear why, fairly early on, the Python ecosystem began introducing libraries and frameworks tailored to ML and data science. Libraries and frameworks such as TensorFlow, Keras, PyTorch, and scikit-learn provided powerful tools for developers in these fields, while less technical analysts could still get started with Python with relative ease. Thanks to its interoperability, Python integrated seamlessly with other programming languages and technologies, making it easy to connect to data pipelines and web applications.

GenAI, with its demands for high computational power and sophisticated algorithms, finds a perfect partner in Python. Here are some examples that readily come to mind:

  • Libraries such as Pandas and NumPy allow efficient manipulation and analysis of large datasets, a fundamental step in training generative models
  • Frameworks such as TensorFlow and PyTorch offer pre-built components to design and train complex neural networks
  • Tools such as Matplotlib and Seaborn enable detailed visualization of data and model outputs, aiding in understanding and refining AI models
  • Frameworks such as Flask and FastAPI make deploying your GenAI models as scalable web services straightforward

Python has a rich ecosystem that is easy to use and allows you to quickly get started, making it an ideal programming language for GenAI projects. Now, let’s talk more about the other pieces of technology you’ll be using throughout the rest of the book.

OpenAI API

The first, and most important, tool of this book is the OpenAI API. In the following chapters, you'll learn more about each component of the GenAI stack, and the most critical one to be familiar with is OpenAI. While we'll cover other LLM providers, OpenAI is the one used in our examples and code repository.

The OpenAI API, launched in mid-2020, provides developers with access to its powerful models, allowing the integration of advanced NLP capabilities into applications. Through this API, developers gain access to some of the most advanced AI models in existence, such as GPT-4. These models are trained on vast datasets and possess exceptional capabilities in natural language understanding and response generation.

Moreover, OpenAI’s infrastructure is built to scale. As your project grows and demands more computational power, OpenAI ensures that you can scale effortlessly without worrying about the underlying hardware or system architecture. OpenAI’s models excel at NLP tasks, including text generation, summarization, translation, and sentiment analysis. This can be invaluable for creating content, chatbots, virtual assistants, and more.
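As a brief sketch of what talking to the API looks like, the chat completions endpoint accepts a JSON body along the following lines. Building the payload requires no API key; actually sending it (for example, with the official `openai` Python package, or an HTTP POST to `https://api.openai.com/v1/chat/completions`) does. The model name and prompt here are illustrative choices, not requirements.

```python
import json

# Sketch of a request body for OpenAI's chat completions endpoint.
# Building the payload needs no API key; sending it does.
payload = {
    "model": "gpt-4o",  # illustrative model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what a vector database does."},
    ],
    "temperature": 0.7,  # higher values produce more varied output
}
body = json.dumps(payload)
```

The `messages` list carries the conversation so far, which is how the API supports multi-turn chat on top of a stateless HTTP request.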

Much of the world's data, from the internet to internal conversations and documentation, is unstructured. OpenAI, as a company, has used that data to train an LLM, and then offered that LLM as a service, making it possible for you to create interactive GenAI applications without hosting or training your own LLM. You'll learn more about LLMs in Chapter 3, Large Language Models.

MongoDB with Vector Search

Much has been said about how MongoDB serves unstructured data use cases, against the objection that the world's data is fundamentally relational. It can be argued that no data is meaningful until humans deem it so, and that the relationships and structure of that data are determined by humans as well. For example, several years ago, a researcher at a leading space exploration company made this memorable comment in a meeting:

"We scraped text content from websites and PDF documents primarily, and we realized it didn't really make sense to try and cram that data into a table."

MongoDB thrives with the messy, unstructured content that characterizes the real world: .txt files, Markdown, PDFs, HTML, and so on. MongoDB is flexible enough to take on whatever structure engineers deem best suited to the purpose, and because of that flexibility, it is a great fit for GenAI use cases.

For that reason, it is much easier to use a document database for GenAI than it is to use an SQL database.

Another reason to use MongoDB is its vector search capability. With vector search, a phrase you store in MongoDB is converted into an array of numbers called a vector. Vectors are numerical representations of data and their context, as shown in Figure 1.3. The numerical representation itself is referred to as an embedding, and the number of values it contains is its dimensionality; generally, the more dimensions an embedding has, the more nuance it can capture.

Figure 1.3: Example of a vector

After you've created embeddings for a piece of data, a mathematical process identifies which vectors are closest to each other, and you can then infer that the underlying data is related. This allows you to return related words instead of only exact matches. For instance, if you are looking for pets, you could find cats, dogs, parakeets, and hamsters, even though those terms are not the exact word pets. Vectors are what allow you to receive results that are related in meaning or context, rather than only exact matches.
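The "closest vectors" idea can be sketched in a few lines of plain Python using cosine similarity, one common measure of how close two embeddings are. The three-dimensional vectors below are made up for illustration; real embeddings have hundreds or thousands of dimensions, and a real query would use an embedding model rather than hand-written numbers.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for a few words.
vectors = {
    "cat": [0.9, 0.8, 0.1],
    "dog": [0.85, 0.75, 0.2],
    "car": [0.1, 0.2, 0.95],
}
query = [0.88, 0.79, 0.15]  # imagine this is the embedding for "pets"

# Rank the words by similarity to the query, most similar first.
ranked = sorted(vectors, key=lambda k: cosine_similarity(query, vectors[k]), reverse=True)
print(ranked)  # "cat" and "dog" rank above "car"
```

This nearest-neighbor ranking is exactly what lets a search for "pets" surface cats and dogs without the word "pets" appearing anywhere in the data.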

MongoDB stores your data embeddings alongside the data itself, which makes subsequent queries faster. Vector search is easiest to understand through an example, with explanations of how it works along the way. You will learn more about vector search in Chapter 8, Implementing Vector Search in AI Applications.
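As an illustrative sketch of what "embeddings alongside the data" looks like, a document might carry its embedding in an ordinary field next to the text it represents. The field names below are made up, and the embedding is truncated for readability; with the `pymongo` driver, you would store such a document with `collection.insert_one(movie_doc)`.

```python
# Illustrative document: the embedding lives in the same document
# as the text it represents, rather than in a separate database.
movie_doc = {
    "title": "The Matrix",
    "plot": "A hacker discovers that reality is a simulation.",
    # Truncated for readability; real embeddings have hundreds of dimensions.
    "plot_embedding": [0.12, -0.03, 0.57, 0.91],
}
```

Because the vector is just another field, a single query can retrieve both the matching text and its embedding, which is what makes the subsequent queries faster.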
