Unlocking Data with Generative AI and RAG

You're reading from Unlocking Data with Generative AI and RAG: Enhance generative AI systems by integrating internal data with large language models using RAG

Product type: Paperback
Published: September 2024
Publisher: Packt
ISBN-13: 9781835887905
Length: 346 pages
Edition: 1st Edition
Author: Keith Bourne
Table of Contents

Preface
Part 1 – Introduction to Retrieval-Augmented Generation (RAG)
  Chapter 1: What Is Retrieval-Augmented Generation (RAG)
  Chapter 2: Code Lab – An Entire RAG Pipeline
  Chapter 3: Practical Applications of RAG
  Chapter 4: Components of a RAG System
  Chapter 5: Managing Security in RAG Applications
Part 2 – Components of RAG
  Chapter 6: Interfacing with RAG and Gradio
  Chapter 7: The Key Role Vectors and Vector Stores Play in RAG
  Chapter 8: Similarity Searching with Vectors
  Chapter 9: Evaluating RAG Quantitatively and with Visualizations
  Chapter 10: Key RAG Components in LangChain
  Chapter 11: Using LangChain to Get More from RAG
Part 3 – Implementing Advanced RAG
  Chapter 12: Combining RAG with the Power of AI Agents and LangGraph
  Chapter 13: Using Prompt Engineering to Improve RAG Efforts
  Chapter 14: Advanced RAG-Related Techniques for Improving Results
Index
Other Books You May Enjoy

Fundamentals of vectors in RAG

In this section, we will cover several important topics related to vectors and embeddings in the context of natural language processing (NLP) and RAG. We will begin by clarifying the relationship between vectors and embeddings, explaining that embeddings are a specific type of vector representation used in NLP. We will then discuss the properties of vectors, such as their dimensions and size, and how these characteristics affect the precision and effectiveness of text search and similarity comparisons.

What is the difference between embeddings and vectors?

Vectors and embeddings are key concepts in NLP and play a crucial role in building language models and RAG systems. But what are they, and how do they relate to each other? To put it simply, you can think of embeddings as a specific type of vector representation. When we are talking about the large language models (LLMs) we use in RAG, which are part of a larger universe called NLP, the vectors we use...
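To make the idea concrete, the short sketch below turns a few sentences into embedding vectors and compares them. This is a minimal illustration, not code from the book's labs: it assumes the sentence-transformers package and the all-MiniLM-L6-v2 model (which produces 384-dimensional vectors) are installed, but any embedding model would serve the same purpose.

# A minimal sketch: turning text into embedding vectors and comparing them.
# Assumes the sentence-transformers package and the all-MiniLM-L6-v2 model
# (384-dimensional output) are available; any embedding model would work.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "RAG combines retrieval with generation.",
    "Retrieval-augmented generation grounds LLM answers in your own data.",
    "The weather is sunny today.",
]

# Each sentence becomes a fixed-length vector -- its embedding.
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 384): three vectors, 384 dimensions each

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: close to 1.0 for similar meaning, lower for unrelated text."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related sentences score higher than unrelated ones.
print(cosine_similarity(embeddings[0], embeddings[1]))  # relatively high
print(cosine_similarity(embeddings[0], embeddings[2]))  # relatively low

Notice that the two RAG-related sentences score a noticeably higher cosine similarity than the unrelated one; this is exactly the property that vector search in a RAG pipeline relies on.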
