What is HPT and why is it important?
Hyperparameter tuning, or HPT for short, is a model optimization technique used widely across ML projects. In this section, we will learn what hyperparameters are, why tuning them matters, and the different methods for finding the best hyperparameters for a machine learning algorithm.
What are hyperparameters?
When we train an ML system, we deal with three kinds of data – input data, model parameters, and model hyperparameters. Input data refers to the training or test data associated with the problem we are solving. Model parameters are the variables that the training process adjusts to fit the training data. Model hyperparameters, on the other hand, are variables that govern the training process itself; they are fixed before we start to train our model. Examples include the learning rate, the choice of optimizer, the batch size, and the number of hidden layers in a neural network.
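To make the distinction concrete, here is a minimal sketch using scikit-learn (the specific estimator and dataset are illustrative choices, not prescribed by this section): hyperparameters such as the learning rate and the number of training passes are set in the constructor before training, while the model parameters (weights and bias) are only produced by fitting on the input data.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Input data: the examples the model learns from (synthetic, for illustration).
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hyperparameters: fixed *before* training; they govern the training process.
clf = SGDClassifier(
    learning_rate="constant",  # learning-rate schedule
    eta0=0.01,                 # initial learning rate
    max_iter=1000,             # maximum number of passes over the data
    random_state=0,
)

# Model parameters: adjusted *during* fit() to fit the training data.
clf.fit(X, y)
print(clf.coef_.shape)  # learned weights (model parameters)
print(clf.intercept_)   # learned bias (model parameter)
```

Changing a hyperparameter (say, `eta0`) and refitting generally yields different learned parameters, which is exactly what HPT exploits: it searches over hyperparameter settings to find the one that produces the best-performing model.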