Database Design and Modeling with Google Cloud
Learn database design and development to take your data to applications, analytics, and AI

Product type: Paperback
Published: Dec 2023
Publisher: Packt
ISBN-13: 9781804611456
Length: 234 pages
Edition: 1st Edition
Author: Abirami Sukumaran
Table of Contents (18)

Preface
Part 1: Database Model: Business and Technical Design Considerations
Chapter 1: Data, Databases, and Design
Chapter 2: Handling Data on the Cloud
Part 2: Structured Data
Chapter 3: Database Modeling for Structured Data
Chapter 4: Setting Up a Fully Managed RDBMS
Chapter 5: Designing an Analytical Data Warehouse
Part 3: Semi-Structured, Unstructured Data, and NoSQL Design
Chapter 6: Designing for Semi-Structured Data
Chapter 7: Unstructured Data Management
Part 4: DevOps and Databases
Chapter 8: DevOps and Databases
Part 5: Data to AI
Chapter 9: Data to AI – Modeling Your Databases for Analytics and ML
Chapter 10: Looking Ahead – Designing for LLM Applications
Index
Other Books You May Enjoy

Getting started with LLMs

Throughout this chapter, we will cover the components and terminology around LLMs and the concepts that are crucial to data modeling for LLM-based applications. The detailed architecture involved in building LLM-based applications, however, is outside the scope of this chapter. As an overview of how LLMs are structured and how they function, they are typically composed of three main components (a short code sketch follows the list):

  • Encoder: The encoder converts the input text into a sequence of numbers, representing each word in the input text as a vector of numbers.
  • Decoder: The decoder generates the output text from that sequence of numbers by predicting the next word of the output, given the words that came before it.
  • Transformer: The transformer is the neural network architecture used to train the encoder and decoder; it can learn long-range dependencies between words.

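To make these three roles concrete, here is a minimal sketch, not taken from the book, that assumes the Hugging Face transformers library and the publicly available GPT-2 model as stand-ins: the tokenizer performs the text-to-numbers step described for the encoder, the transformer predicts the next tokens, and decoding turns the predicted numbers back into text.

# A minimal sketch, not from the book, of the encode -> predict -> decode flow
# described above. It assumes the Hugging Face "transformers" library and the
# publicly available GPT-2 model as stand-ins for an LLM.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Encoding: the input text becomes a sequence of numbers (token IDs).
inputs = tokenizer("Databases on Google Cloud are", return_tensors="pt")

# Prediction: the transformer generates the next token IDs one at a time,
# each conditioned on the tokens that came before it.
output_ids = model.generate(**inputs, max_new_tokens=20)

# Decoding: the predicted token IDs are converted back into output text.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Running this prints a short continuation of the prompt, exercising all three roles in one round trip; note that GPT-2 is a decoder-only transformer, so the text-to-numbers step is handled by the tokenizer rather than by a separate encoder network.
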
To give a high-level summary, LLMs work by learning...
