Building LLM Powered Applications

You're reading from Building LLM Powered Applications: Create intelligent apps and agents with large language models

Product type Paperback
Published in May 2024
Publisher Packt
ISBN-13 9781835462317
Length 342 pages
Edition 1st Edition
Author: Valentina Alto

Table of Contents (16 chapters)

Preface
1. Introduction to Large Language Models (FREE CHAPTER)
2. LLMs for AI-Powered Applications
3. Choosing an LLM for Your Application
4. Prompt Engineering
5. Embedding LLMs within Your Applications
6. Building Conversational Applications
7. Search and Recommendation Engines with LLMs
8. Using LLMs with Structured Data
9. Working with Code
10. Building Multimodal Applications with LLMs
11. Fine-Tuning Large Language Models
12. Responsible AI
13. Emerging Trends and Innovations
14. Other Books You May Enjoy
15. Index

A decision framework to pick the right LLM

In the previous sections, we covered some of the most promising large language models (LLMs) available on the market today. The question now is: which one should you use in your applications? The truth is that there is no straightforward answer. There are many factors to consider when choosing an LLM, and those factors play out differently across two scenarios: proprietary and open-source LLMs. Below are some of the factors and trade-offs you might want to weigh while choosing your LLM:

  • Size and performance. We saw that more complex models (that is, those with a higher number of parameters) tend to perform better, especially in terms of parametric knowledge and generalization capabilities. Nevertheless, the larger the model, the more computation and memory it requires to process the input and generate the output, which can result in higher latency and, as we will see, higher costs.
  • Cost and...
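The trade-offs above can be made concrete with a simple weighted-scoring sketch. Everything below is illustrative rather than part of the book's framework: the model names, quality scores, latencies, and prices are hypothetical placeholders, not measurements, and the weights simply encode how much a given application cares about each factor.

```python
# Toy weighted-scoring helper for comparing candidate LLMs.
# All names and numbers are illustrative placeholders, not benchmarks.

def score_model(model, weights):
    """Return a weighted score for a candidate model; higher is better.

    Quality is 'higher is better'; latency and cost are 'lower is
    better', so they enter the score with a negative sign.
    """
    return (weights["quality"] * model["quality"]
            - weights["latency"] * model["latency_s"]
            - weights["cost"] * model["cost_per_1k_tokens"])

candidates = [
    {"name": "large-proprietary", "quality": 0.90,
     "latency_s": 2.0, "cost_per_1k_tokens": 0.03},
    {"name": "small-open-source", "quality": 0.70,
     "latency_s": 0.5, "cost_per_1k_tokens": 0.001},
]

# The weights express the application's priorities: a latency-sensitive,
# cost-sensitive chatbot might look like this, while an offline
# analytics pipeline could weight quality far more heavily.
weights = {"quality": 10, "latency": 1, "cost": 100}

best = max(candidates, key=lambda m: score_model(m, weights))
print(best["name"])
```

With these particular weights the cheaper, faster model wins despite its lower quality score; shifting weight toward quality flips the ranking, which is exactly the kind of trade-off the factors in this list describe.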