Azure Machine Learning Engineering
Deploy, fine-tune, and optimize ML models using Microsoft Azure

Product type: Paperback
Published: Jan 2023
Publisher: Packt
ISBN-13: 9781803239309
Length: 362 pages
Edition: 1st Edition
Authors (4): Balamurugan Balakreshnan, Dennis Michael Sawyers, Sina Fakhraee Ph.D, Megan Masanz
Table of Contents (17)

Preface
Part 1: Training and Tuning Models with the Azure Machine Learning Service
Chapter 1: Introducing the Azure Machine Learning Service
Chapter 2: Working with Data in AMLS
Chapter 3: Training Machine Learning Models in AMLS
Chapter 4: Tuning Your Models with AMLS
Chapter 5: Azure Automated Machine Learning
Part 2: Deploying and Explaining Models in AMLS
Chapter 6: Deploying ML Models for Real-Time Inferencing
Chapter 7: Deploying ML Models for Batch Scoring
Chapter 8: Responsible AI
Chapter 9: Productionizing Your Workload with MLOps
Part 3: Productionizing Your Workload with MLOps
Chapter 10: Using Deep Learning in Azure Machine Learning
Chapter 11: Using Distributed Training in AMLS
Index
Other Books You May Enjoy

Summary

In this chapter, we explored what model parameters are and how a sweep job can be leveraged to tune the hyperparameters defined for a given model. We also examined the options for configuring a sweep job, based on the search space and the sampling methodology you select. AMLS can sweep across that search space on a compute cluster, automating the hyperparameter tuning process; once the trials are complete, the cluster scales down after its idle period, so compute resources are used wisely.
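As a quick refresher, the snippet below is a minimal sketch of defining a sweep job with the Azure ML Python SDK v2 (azure-ai-ml). The script name (train.py), compute cluster name (cpu-cluster), curated environment, and hyperparameter names are placeholders for illustration; substitute the values you used in your own trials.

from azure.ai.ml import MLClient, command
from azure.ai.ml.sweep import Choice, Uniform
from azure.identity import DefaultAzureCredential

# Connect to the workspace (IDs are placeholders)
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Base command job whose inputs will be swept over
job = command(
    code="./src",  # assumed folder containing train.py
    command=(
        "python train.py "
        "--learning_rate ${{inputs.learning_rate}} "
        "--n_estimators ${{inputs.n_estimators}}"
    ),
    inputs={"learning_rate": 0.1, "n_estimators": 100},
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",  # assumed compute cluster name
)

# Replace the fixed inputs with a search space
job_for_sweep = job(
    learning_rate=Uniform(min_value=0.01, max_value=0.3),
    n_estimators=Choice(values=[50, 100, 200]),
)

# Random sampling over the search space, maximizing the logged accuracy metric
sweep_job = job_for_sweep.sweep(
    compute="cpu-cluster",
    sampling_algorithm="random",
    primary_metric="accuracy",
    goal="Maximize",
)
sweep_job.set_limits(max_total_trials=20, max_concurrent_trials=4, timeout=7200)

returned_job = ml_client.jobs.create_or_update(sweep_job)
print(returned_job.name)

Limits such as max_total_trials and max_concurrent_trials keep the sweep from consuming more cluster time than you intend.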

In addition to setting up a sweep job, you reviewed your results both in the Studio and in code, gaining intuitive insight into the best-performing model for your use case. Now that you have completed the chapter, be sure to shut down your compute resources to save costs.
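For reference, one way to review trial results in code is to query the workspace's MLflow tracking store, as sketched below. The experiment name and the metric and parameter names are assumptions; use whatever your sweep job and training script actually logged.

import mlflow

# The MLflow tracking URI can be copied from the workspace overview in the Studio,
# or retrieved via the SDK: ml_client.workspaces.get("<workspace-name>").mlflow_tracking_uri
# (resolving an azureml:// URI requires the azureml-mlflow package).
mlflow.set_tracking_uri("<workspace-mlflow-tracking-uri>")

# List the trial (child) runs of the sweep, best accuracy first
runs = mlflow.search_runs(
    experiment_names=["tune-models-with-amls"],  # assumed experiment name
    order_by=["metrics.accuracy DESC"],
)
print(runs[["run_id", "metrics.accuracy", "params.learning_rate"]].head())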

In the next chapter, we will show you how to leverage AMLS to take over the time-consuming task of model development. This functionality is not...
