Hyperparameter optimization
How do we know which hyperparameters to use and what their values should be? Hyperparameters can be chosen based on domain knowledge, experience, or trial and error, but the most efficient way to find the best values is a systematic process called hyperparameter optimization, or hyperparameter tuning, which can be implemented via several mechanisms that we will discuss next. Ultimately, the goal of hyperparameter optimization is to tune a model's hyperparameters so that the model achieves the best performance, as measured by running it against a validation set: a subset of our source dataset held out for this purpose.
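To make the idea concrete, here is a minimal sketch of tuning a single hyperparameter against a validation set. The hyperparameter, the ridge penalty `alpha`, and the closed-form one-feature ridge model are illustrative assumptions, not examples from the text; the point is only the loop: fit on the training subset, score on the validation subset, keep the best-scoring value.

```python
# Minimal sketch of hyperparameter tuning against a validation set.
# The hyperparameter here is a ridge penalty `alpha` (a hypothetical
# example); the "model" is one-feature ridge regression solved in
# closed form with NumPy.
import numpy as np

rng = np.random.default_rng(0)

# Source dataset: y = 3x + noise, split into training and validation subsets.
x = rng.uniform(-1, 1, 200)
y = 3.0 * x + rng.normal(0, 0.1, 200)
x_train, y_train = x[:150], y[:150]
x_val, y_val = x[150:], y[150:]

def fit_ridge(x, y, alpha):
    # Closed-form ridge weight for a single feature: w = (x.y) / (x.x + alpha)
    return (x @ y) / (x @ x + alpha)

def val_mse(w):
    # Performance measured on the held-out validation subset.
    return float(np.mean((x_val * w - y_val) ** 2))

# Try each candidate value and keep the one with the lowest validation error.
candidates = [0.0, 0.1, 1.0, 10.0]
best_alpha = min(
    candidates,
    key=lambda a: val_mse(fit_ridge(x_train, y_train, a)),
)
print("best alpha:", best_alpha)
```

The same pattern generalizes to the search mechanisms described next; they differ only in how the candidate values are generated, not in how each candidate is scored.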
Methods for optimizing hyperparameter values
In Chapter 2, we described hyperparameter tuning mechanisms such as grid search, random search, and Bayesian optimization, summarized here as a quick refresher:
- Grid search: This is an exhaustive search of the entire hyperparameter space (i.e., it tries out every possible...