Hands-on deployment (for the business problem)
In this section, we will learn how to deploy solutions for the business problem we have been working on. So far, we have processed the data, trained ML models, serialized them, and registered them to the Azure ML workspace. Now, we will explore how inference is performed with the serialized models on a container instance and on an auto-scaling cluster. These deployments will give you a broad understanding of the options available and prepare you well for your future assignments.
We will use Python as the primary programming language, and Docker and Kubernetes for building and deploying containers. We will start by deploying a REST API service on an Azure container instance using Azure ML. Next, we will deploy a REST API service on an auto-scaling cluster with Azure ML, using Kubernetes for container orchestration. Lastly, we will deploy on an Azure container instance using MLflow and an open source ML framework; this way, we will learn how to...
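To make the first of these steps concrete before we walk through it in detail, here is a minimal sketch of deploying a registered model as a REST API on an Azure container instance with the Azure ML Python SDK (v1, azureml-core). The model name, service name, entry script (score.py), and conda environment file (env.yml) used here are illustrative placeholders, not names from our workspace, and a local config.json for the workspace is assumed to exist.

```python
# A minimal sketch, assuming the azureml-core SDK is installed, a config.json
# for the workspace is available locally, and a model has already been
# registered under the (hypothetical) name "business-problem-model".
from azureml.core import Workspace, Model, Environment
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice

# Connect to the Azure ML workspace using the local config.json
ws = Workspace.from_config()

# Fetch the latest registered version of the serialized model (name is a placeholder)
model = Model(ws, name="business-problem-model")

# score.py (assumed to exist) must define init() and run() for the scoring service;
# env.yml (also assumed) lists the conda dependencies needed at inference time
env = Environment.from_conda_specification(name="deploy-env", file_path="env.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Request a small Azure Container Instance to host the REST API
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

# Deploy the model as a web service and wait for it to come online
service = Model.deploy(
    workspace=ws,
    name="prediction-service",
    models=[model],
    inference_config=inference_config,
    deployment_config=aci_config,
)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)  # REST endpoint that serves predictions
```

The same deployment pattern carries over to the auto-scaling Kubernetes cluster later in this section; only the deployment configuration and compute target change, while the model, entry script, and environment stay the same.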