Deploying an MLflow model with managed online endpoints through AML Studio
To deploy a model to a web service, we would normally be required to define an environment, which includes the Conda and pip
dependencies, our compute resources, and a scoring script. The scoring script, also called an entry script, loads the model in an initialization function and handles running predictions on the data sent to the web service.
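To make the entry script's two responsibilities concrete, here is a minimal sketch of its structure. The stub model, the function bodies, and the payload shape are illustrative assumptions, not AML requirements beyond the `init()`/`run()` contract; in a real entry script, `init()` would load the registered model from the directory AML provides (the `AZUREML_MODEL_DIR` environment variable) rather than instantiate a stub.

```python
import json

# Hypothetical stand-in for a real trained model; in AML, init() would
# instead load the registered model from the AZUREML_MODEL_DIR directory,
# e.g. with joblib or mlflow.
class _StubModel:
    def predict(self, rows):
        # Toy prediction: sum the features in each row.
        return [sum(row) for row in rows]

model = None

def init():
    # Called once when the web service starts: load the model here.
    global model
    model = _StubModel()

def run(raw_data):
    # Called per request: parse the JSON payload, score it, return results.
    data = json.loads(raw_data)["data"]
    predictions = model.predict(data)
    return json.dumps({"predictions": predictions})

# Local smoke test of the same entry points the web service would call.
init()
print(run(json.dumps({"data": [[1, 2], [3, 4]]})))  # → {"predictions": [3, 7]}
```

The key point is the division of labor: `init()` runs once at container startup, while `run()` executes for every scoring request.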
With MLflow models, the model is packaged together with everything AML needs to consume it, so there is no need to configure an environment or an entry script when deploying to managed online endpoints; AML understands these models natively. This makes deploying the model very easy, both from the UI and through code.
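As a sketch of what this no-code deployment looks like when driven through code, the following is a hypothetical Azure ML CLI (v2) deployment YAML for an MLflow model; the deployment name, endpoint name, model path, and VM size are all placeholders. Notice that no `environment` or `code_configuration` (scoring script) section is needed, precisely because the model is an MLflow model:

```yaml
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-mlflow-endpoint
model:
  path: ./model          # placeholder path to the MLflow model folder
  type: mlflow_model     # tells AML to deploy this without an entry script
instance_type: Standard_DS3_v2
instance_count: 1
```

A file like this could then be submitted with a command along the lines of `az ml online-deployment create -f deployment.yml`, assuming the endpoint already exists.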
In previous chapters, we leveraged MLflow to create and register models. Proceed to the *Chapter 6, Prep-Model Creation & Registration.ipynb* notebook to create and register a model leveraging MLflow, as we did in...