Machine Learning Engineering with MLflow

Manage the end-to-end machine learning life cycle with MLflow

Product type: Paperback
Published: August 2021
Publisher: Packt
ISBN-13: 9781800560796
Length: 248 pages
Edition: 1st
Author: Natu Lauchande
Table of Contents (18 chapters)

Preface
Section 1: Problem Framing and Introductions
  Chapter 1: Introducing MLflow
  Chapter 2: Your Machine Learning Project
Section 2: Model Development and Experimentation
  Chapter 3: Your Data Science Workbench
  Chapter 4: Experiment Management in MLflow
  Chapter 5: Managing Models with MLflow
Section 3: Machine Learning in Production
  Chapter 6: Introducing ML Systems Architecture
  Chapter 7: Data and Feature Management
  Chapter 8: Training Models with MLflow
  Chapter 9: Deployment and Inference with MLflow
Section 4: Advanced Topics
  Chapter 10: Scaling Up Your Machine Learning Workflow
  Chapter 11: Performance Monitoring
  Chapter 12: Advanced Topics with MLflow
Other Books You May Enjoy

Creating an API process for inference

The code required for this section is in the pystock-inference-api folder. The MLflow infrastructure is provided in the Docker image accompanying the code as shown in the following figure:

Figure 9.2 – The structure of the API job

Setting up an API system is straightforward if we rely on MLflow's built-in REST API scoring server. To test the API, we will use an artifact store on the local filesystem.

We can serve our models with the following set of commands, whose core is the models serve command of the MLflow CLI:

cd /gradflow/
export MLFLOW_TRACKING_URI=http://localhost:5000
mlflow models serve -m "models:/training-model-psystock/Production" -p 6000
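Once the model is being served on port 6000, it can be queried over HTTP at the /invocations endpoint. The following is a minimal sketch of building such a request; the feature column names here are hypothetical placeholders, and the exact payload schema depends on your MLflow version (this pandas-split style matches the 1.x series, while newer releases expect a top-level dataframe_split key):

```python
import json

# Hypothetical feature columns -- replace with the columns that
# the training-model-psystock model was actually trained on.
payload = {
    "columns": ["open", "high", "low", "volume"],
    "data": [[132.5, 134.1, 131.9, 2150000]],
}
body = json.dumps(payload)

# With the server from the preceding commands running locally:
# import requests
# response = requests.post(
#     "http://localhost:6000/invocations",
#     data=body,
#     headers={"Content-Type": "application/json; format=pandas-split"},
# )
# print(response.json())
```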

Next, we will package the preceding commands into a Docker image so that it can be deployed in any environment. The steps to achieve this are as follows:

  1. Generate a Docker image specifying the work directory and the...
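A Docker image along these lines can be sketched as the following Dockerfile; the base image, working directory, and file layout are assumptions for illustration rather than the book's exact setup:

```dockerfile
# Assumed Python base image; the accompanying code's image may differ
FROM python:3.8-slim

# Working directory matching the path used in the serve commands
WORKDIR /gradflow

# Install MLflow (pin to the version used in your project)
RUN pip install mlflow

# Copy the project, including the local artifact store
COPY . /gradflow

# Point the CLI at the tracking server and expose the scoring port
ENV MLFLOW_TRACKING_URI=http://localhost:5000
EXPOSE 6000

CMD ["mlflow", "models", "serve", \
     "-m", "models:/training-model-psystock/Production", \
     "-p", "6000", "-h", "0.0.0.0"]
```

Binding to 0.0.0.0 (rather than the default localhost) is what makes the scoring server reachable from outside the container when the port is published.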