Tuning hyperparameters with AutoML
In machine learning and deep learning, hyperparameter tuning is the process of selecting the set of optimal hyperparameters for our learning algorithm. Here, hyperparameters are values that control the learning process, whereas the model's parameters, such as its weights, are learned from the data. In this sense, the term hyperparameter follows its statistical meaning; that is, it is a parameter of a prior distribution that captures our prior belief before we start to learn from the data.
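To make this distinction concrete, here is a minimal sketch, assuming scikit-learn is available and using a small synthetic dataset purely for illustration. The regularization strength C is a hyperparameter we fix before training, while the model's coefficients are parameters estimated from the data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Small synthetic dataset, used only for illustration.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# C is a hyperparameter: we choose its value before training starts.
model = LogisticRegression(C=0.5)

# The coefficients (weights) are parameters: they are learned from the data.
model.fit(X, y)
print("Learned coefficients:", model.coef_)
```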
In machine learning and deep learning, it is also common to refer to the parameters that are set before we start to train our model, and that control the training process, as hyperparameters. Some examples of hyperparameters used in deep learning are as follows (a short sketch follows the list):
- Learning rate
- Number of epochs
- Number of hidden layers
- Number of hidden units per layer
- Activation functions
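The following sketch, assuming TensorFlow/Keras is installed, fixes each of the hyperparameters listed above by hand before training; the specific values and the input shape are arbitrary and purely illustrative:

```python
import tensorflow as tf

# Hyperparameters chosen before training (values are illustrative).
learning_rate = 1e-3
epochs = 10
hidden_layers = 2
hidden_units = 64
activation = "relu"

# Build a simple fully connected network from the hyperparameters above.
model = tf.keras.Sequential()
model.add(tf.keras.layers.Input(shape=(20,)))  # 20 input features (assumed)
for _ in range(hidden_layers):
    model.add(tf.keras.layers.Dense(hidden_units, activation=activation))
model.add(tf.keras.layers.Dense(1, activation="sigmoid"))

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# model.fit(x_train, y_train, epochs=epochs)  # training data not shown here
```

Instead of fixing these values by hand as above, hyperparameter tuning with AutoML searches over many such combinations automatically.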
These parameters will directly influence the performance...