Summary
Implementing models in production can be challenging, but tools designed to support productionizing models and automating the MLOps process make it much easier. In this chapter, we looked at using the UC Model Registry to manage the life cycle of an ML model. We highlighted MLflow and how it can be used to create reproducible, modularized data science workflows that automatically track parameters and performance metrics. We also discussed techniques for calculating features at the time of inference. To make the end-to-end MLOps process more manageable, we showed how to use workflows and webhooks to automate the ML life cycle. Finally, we showed how to serve models and perform inference using MLflow and the Databricks platform.
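As a brief recap of the tracking and registration pattern covered in this chapter, the sketch below logs parameters and metrics with MLflow and registers the trained model in the UC Model Registry. The dataset, model, and the three-level name main.ml_models.example_classifier are illustrative placeholders rather than names used in the chapter.

```python
import mlflow
from mlflow.models import infer_signature
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Point MLflow at the Unity Catalog model registry
mlflow.set_registry_uri("databricks-uc")

# Toy data and model purely for illustration
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
model = RandomForestClassifier(n_estimators=100, max_depth=5, random_state=42)

with mlflow.start_run():
    # Track the parameters and performance metrics of this run
    mlflow.log_param("n_estimators", 100)
    mlflow.log_param("max_depth", 5)

    model.fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))

    # Log the model and register it under a hypothetical catalog.schema.model name
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        signature=infer_signature(X, model.predict(X)),
        registered_model_name="main.ml_models.example_classifier",
    )
```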
In the final chapter, Monitoring, Evaluating, and More, we will look at monitoring our data and ML models within the Databricks Lakehouse so that you can get the most value from your data.