9. Hyperparameter tuning and Automated Machine Learning
In the previous chapter, we learned how to train convolutional and more complex deep neural networks (DNNs). When training these models, we face many complex configuration choices: the number of layers, the order of layers, regularization, batch size, learning rate, the number of epochs, and more. This is not only true for DNNs; the same problem arises when selecting the correct preprocessing steps, features, models, and parameters in statistical ML approaches.
In this chapter, we will look at automating parts of the training process in order to remove some of these error-prone human choices from machine learning. These tuning techniques will help you train better models faster and more efficiently. First, we will look at hyperparameter tuning (implemented as HyperDrive in Azure Machine Learning), a standard technique for optimizing all parameter...
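Before turning to HyperDrive itself, the core idea of hyperparameter tuning can be illustrated with a minimal random-search sketch in plain Python. This is not Azure Machine Learning code; the objective function is a hypothetical stand-in for a real training-and-validation run, and the search space is invented for illustration:

```python
import random

# Hypothetical stand-in for a real training run: pretend this returns the
# validation score of a model trained with the given hyperparameters.
def validation_score(learning_rate, batch_size):
    # Toy objective peaking near learning_rate=0.01 and batch_size=64.
    return 1.0 - abs(learning_rate - 0.01) * 10 - abs(batch_size - 64) / 256

# Search space: a sampler per hyperparameter (assumed ranges, for illustration).
space = {
    "learning_rate": lambda: 10 ** random.uniform(-4, -1),  # log-uniform
    "batch_size": lambda: random.choice([16, 32, 64, 128, 256]),
}

def random_search(n_trials=50, seed=42):
    """Sample random configurations, evaluate each, and keep the best."""
    random.seed(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: sample() for name, sample in space.items()}
        score = validation_score(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search()
print(best_params, best_score)
```

In a real workflow, each call to `validation_score` would be a full training run, which is why tuning services such as HyperDrive focus on running these trials in parallel and stopping unpromising ones early.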