Summary
We started the chapter with hyperparameter tuning. We described the three basic search algorithms used for hyperparameter tuning (grid search, random search, and Bayesian optimization) and introduced several tools you can integrate into your project. Among those tools, we focused on Ray Tune because it supports distributed hyperparameter tuning and implements many state-of-the-art search algorithms out of the box.
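As a quick refresher, a minimal Ray Tune run follows the pattern below. The toy objective, the lr search space, and the sample count are placeholders rather than the chapter's actual training code, and the classic tune.run entry point shown here may differ slightly across Ray versions:

```python
from ray import tune


def trainable(config):
    # Toy objective standing in for a real training loop (placeholder)
    score = (config["lr"] - 0.01) ** 2
    tune.report(score=score)  # report the metric for this trial


analysis = tune.run(
    trainable,
    config={"lr": tune.loguniform(1e-4, 1e-1)},  # search space
    num_samples=10,  # number of trials to sample
    metric="score",
    mode="min",
)
print(analysis.best_config)
```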
Then, we discussed Explainable AI. We explained the most common techniques (PFI, FI, SHAP, and LIME) and how they can be used to understand how a model's behavior changes with respect to each feature in a dataset.
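For reference, a typical SHAP workflow looks like the following sketch; the scikit-learn model and dataset are stand-ins for whatever model you trained, not the chapter's example:

```python
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Placeholder model and data; any fitted tree-based model works the same way
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)            # SHAP values for tree models
shap_values = explainer.shap_values(X.iloc[:100])
shap.summary_plot(shap_values, X.iloc[:100])     # per-feature impact on the output
```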
In the next chapter, we will shift our focus to deployment. We will learn about ONNX, an open format for ML models, and look at how to convert a TensorFlow or PyTorch model into an ONNX model.