Machine Learning on Kubernetes
A practical handbook for building and using a complete open source machine learning platform on Kubernetes

Authors: Ross Brigoli, Faisal Masood
Product type: Paperback
Published: June 2022
Publisher: Packt
ISBN-13: 9781803241807
Length: 384 pages
Edition: 1st Edition
Table of Contents

Preface
Part 1: The Challenges of Adopting ML and Understanding MLOps (What and Why)
Chapter 1: Challenges in Machine Learning
Chapter 2: Understanding MLOps
Chapter 3: Exploring Kubernetes
Part 2: The Building Blocks of an MLOps Platform and How to Build One on Kubernetes
Chapter 4: The Anatomy of a Machine Learning Platform
Chapter 5: Data Engineering
Chapter 6: Machine Learning Engineering
Chapter 7: Model Deployment and Automation
Part 3: How to Use the MLOps Platform and Build a Full End-to-End Project Using the New Platform
Chapter 8: Building a Complete ML Project Using the Platform
Chapter 9: Building Your Data Pipeline
Chapter 10: Building, Deploying, and Monitoring Your Model
Chapter 11: Machine Learning on Kubernetes
Other Books You May Enjoy

Understanding model inferencing with Seldon Core

In the previous chapter, you built and trained a model. Data science teams build such models to be used in production, where they serve prediction requests. There are many ways to use a model in production, such as embedding it directly in a customer-facing application, but the most common approach is to expose the model as a REST API that any application can then call. In general, running a model and serving predictions from it in production is called model serving.
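
To make this concrete, here is a minimal sketch of how a client application might call a model that has been exposed as a REST API. The endpoint URL, payload shape, and feature values are hypothetical placeholders rather than any specific serving framework's API:

import requests

# Hypothetical endpoint of a model that has been deployed as a REST API.
url = "http://models.example.com/my-classifier/predict"

# Features for a single prediction request; the shape and values are
# illustrative only and depend on how the model was trained.
payload = {"instances": [[5.1, 3.5, 1.4, 0.2]]}

response = requests.post(url, json=payload, timeout=10)
response.raise_for_status()
print(response.json())  # for example: {"predictions": [0]}

Because the model sits behind an ordinary HTTP endpoint, any application, in any language, can consume its predictions without knowing how the model was built.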

However, once the model is in production, it needs to be monitored for performance and updated when it no longer meets the expected criteria. A hosted model solution enables you not only to serve the model but also to monitor its performance and generate alerts that can be used to trigger retraining of the model.
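
The sketch below illustrates the underlying idea only, not any particular monitoring tool: a monitored quality metric is compared against a threshold, and a breach becomes the signal that kicks off retraining. The metric source and threshold are hypothetical.

# Illustrative sketch of monitoring-driven retraining: in a real platform,
# the metric would come from the monitoring stack and the alert would
# start a retraining pipeline.
def should_retrain(recent_accuracy: float, threshold: float = 0.85) -> bool:
    # Flag retraining when the model's recent accuracy drops below the
    # agreed threshold.
    return recent_accuracy < threshold

if should_retrain(0.78):
    print("Accuracy below threshold - trigger the retraining pipeline")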

Seldon is a UK-based firm that has created a set of tools for managing the model life cycle. Seldon Core is an open source framework that helps expose ML models so that they can be consumed as REST APIs...
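
For illustration, Seldon Core's Python language wrapper lets you serve a model by writing a small class that the framework wraps with a REST (and gRPC) interface. This is a minimal sketch: the class name and model file are hypothetical, and the model artifact is assumed to have been saved with joblib.

import joblib

class Classifier:
    # A minimal Seldon Core Python wrapper class (sketch).

    def __init__(self):
        # Load the trained model artifact once at container start-up.
        self.model = joblib.load("model.joblib")

    def predict(self, X, features_names=None):
        # Seldon Core routes each prediction request to this method and
        # returns the result over the REST API it exposes.
        return self.model.predict_proba(X)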
