Integrating ML Systems in Ecosystems
ML systems have become popular for two reasons: their ability to learn from data (which we've explored throughout this book) and their ability to be packaged into web services.
Packaging ML systems into web services lets us integrate them into workflows in a very flexible way. Instead of compiling components together or linking against dynamic libraries, we can deploy ML components that communicate over HTTP and exchange JSON payloads. We have already used this style of interface when calling the GPT-3 model hosted by OpenAI. In this chapter, we'll explore creating a Docker container with a pre-trained ML model, deploying it, and integrating it with other components.
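To make the HTTP-plus-JSON idea concrete before we turn to Flask and Docker, here is a minimal, dependency-free sketch using only Python's standard library. The "model" is a hypothetical placeholder (it just sums the input features); a real service would load a pre-trained model instead. The endpoint path `/predict` and the `features`/`prediction` field names are illustrative assumptions, not a fixed API.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class PredictionHandler(BaseHTTPRequestHandler):
    """Answers JSON prediction requests over HTTP."""

    def do_POST(self):
        # Read and parse the JSON request body.
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        # Placeholder for model.predict(...): sum the features.
        prediction = sum(payload["features"])
        body = json.dumps({"prediction": prediction}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet


def query(url, features):
    """Client side: POST features as JSON, return the decoded response."""
    req = Request(
        url,
        data=json.dumps({"features": features}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Bind to an ephemeral port and serve in a background thread.
    server = HTTPServer(("127.0.0.1", 0), PredictionHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/predict"
    print(query(url, [1.0, 2.5, 0.5]))  # {'prediction': 4.0}
    server.shutdown()
```

Because the client only speaks HTTP and JSON, it does not care whether the server is this sketch, a Flask application, or a Docker container on another machine, which is exactly the flexibility this chapter builds on.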
In this chapter, we’re going to cover the following main topics:
- ML system of systems – software ecosystems
- Creating web services over ML models using Flask
- Deploying ML models using Docker ...