In this chapter, you learned about model parameters, hyperparameters, and configuration space. Let's review them quickly:
- Model parameters: These are the parameters that the model learns from the data during training
- Model hyperparameters: These are the parameters that you must define before the training run starts
- Configuration space parameters: These refer to any other parameters used for the environment that hosts your experiment
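To make the first two definitions concrete, here is a minimal sketch using scikit-learn's `LogisticRegression` (an illustrative choice, not code from the chapter): the regularization strength `C` is a hyperparameter set before training, while the coefficients and intercept are model parameters learned during training.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy dataset purely for illustration.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hyperparameter: C is chosen BEFORE training starts.
model = LogisticRegression(C=0.1)
model.fit(X, y)

# Model parameters: coefficients and intercept are LEARNED during training.
print(model.coef_.shape)  # one learned weight per feature
print(model.intercept_)
```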
You have also been introduced to common hyperparameter optimization methods, such as grid search and randomized search. Neither method uses the information produced by previous training runs, a disadvantage that Bayesian-based optimization methods address.
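Both methods can be sketched with scikit-learn's `GridSearchCV` and `RandomizedSearchCV` (the dataset and search spaces below are assumptions for illustration). Note that in both cases every candidate is chosen up front: no trial's result influences which candidate is tried next.

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    param_grid={"C": [0.01, 0.1, 1, 10]}, cv=3)
grid.fit(X, y)

# Randomized search: samples a fixed number of candidates (n_iter)
# from the given distributions, independently of earlier results.
rand = RandomizedSearchCV(LogisticRegression(max_iter=1000),
                          param_distributions={"C": loguniform(1e-3, 1e2)},
                          n_iter=5, cv=3, random_state=0)
rand.fit(X, y)

print(grid.best_params_)
print(rand.best_params_)
```

Randomized search is often preferred when the search space is large, since it covers it more evenly for the same budget of training runs.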
Bayesian-based optimization methods leverage information from previous training runs to decide the hyperparameter values for...