Chapter 12: Operationalizing Models with Code
In this chapter, you are going to learn how to operationalize the machine learning models you have trained so far in this book. You will explore two approaches: exposing a real-time endpoint by hosting a REST API that you can use to make inferences, and expanding your pipeline authoring knowledge to make inferences on top of big data, in parallel and efficiently. You will begin by registering a model in the workspace to keep track of the artifact. Then, you will publish a REST API, which will allow your model to integrate with third-party applications such as Power BI. Following this, you will author a pipeline to process half a million records within a couple of minutes in a very cost-effective manner.
In this chapter, we are going to cover the following topics:
- Understanding the various deployment options
- Registering models in the workspace
- Deploying real-time endpoints
- Creating a batch inference...
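Before diving into the workspace-specific tooling, the core idea behind the batch inference approach previewed above can be sketched generically: split a large set of records into chunks and score each chunk in parallel. The sketch below uses only Python's standard library and a hypothetical `score` function standing in for a real model's `predict` call; it is an illustration of the fan-out pattern, not the actual pipeline API covered later in the chapter.

```python
from concurrent.futures import ThreadPoolExecutor

def score(record: float) -> float:
    # Hypothetical model inference; a real pipeline would load a
    # registered model and call its predict() method instead.
    return record * 2.0

def score_chunk(chunk):
    # Each worker scores one chunk of records independently.
    return [score(r) for r in chunk]

def batch_inference(records, chunk_size=1000, workers=4):
    # Split the input into fixed-size chunks so the work can be
    # distributed across workers, mirroring how a batch pipeline
    # fans records out to parallel compute nodes.
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for scored in pool.map(score_chunk, chunks):
            results.extend(scored)
    return results

if __name__ == "__main__":
    predictions = batch_inference([float(i) for i in range(10_000)])
    print(len(predictions))
```

In a managed pipeline, each chunk would typically map to a mini-batch handed to a separate compute node, which is what makes scoring hundreds of thousands of records in minutes cost-effective.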