An overview of different execution scenarios and environments
In the previous chapters, we focused mainly on learning how to track DL pipelines using MLflow's tracking capabilities, and most of our executions ran in a local environment, such as a laptop or desktop. However, as we already know, the full DL life cycle consists of different stages, and we may need to run a DL pipeline entirely, partially, or as a single step in a different execution environment. Here are two typical examples:
- When accessing data for model training purposes, it is not uncommon for the data to have to reside in an enterprise security- and privacy-compliant environment, where both the computation and the storage cannot leave that compliant boundary.
- When training a DL model, it is usually desirable to use a remote GPU cluster to maximize training efficiency, since a local laptop typically lacks the required hardware capability.
Both scenarios require running the DL pipeline, or part of it, outside the local development environment, as shown in the sketch that follows.
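For concreteness, the snippet below is a minimal sketch of how switching the execution environment might look with MLflow's project API, where the same pipeline is submitted to a remote backend instead of being run locally. The project URI, parameters, and cluster specification file are hypothetical placeholders, and the Databricks backend is just one of the remote backends MLflow supports.

```python
import mlflow

# A minimal sketch: launch the same MLflow project on a remote backend
# instead of running it on the local machine. All names below are
# hypothetical placeholders for illustration only.
submitted_run = mlflow.projects.run(
    uri="https://github.com/your-org/dl-pipeline-project",  # hypothetical project repo
    entry_point="main",
    parameters={"epochs": "5"},          # hypothetical pipeline parameter
    backend="databricks",                # target a remote Databricks cluster
    backend_config="cluster_spec.json",  # hypothetical cluster specification file
    synchronous=False,                   # return immediately; do not block locally
)

# The run executes remotely; only its ID is needed to track it from here.
print(submitted_run.run_id)
```

The key point is that the pipeline code itself does not change; only the backend and its configuration do, which is what makes it practical to move between a local laptop, a compliant data environment, and a remote GPU cluster.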