Bayesian Optimization
A major drawback of both grid search and random search is that neither technique keeps track of past evaluations of the hyperparameter combinations used for model training. Ideally, if the process could be informed by the historical performance of previously selected hyperparameters and use that information to steer subsequent iterations in the right direction, the number of iterations required to find the optimal set of hyperparameter values would drop drastically. Grid search and random search, however, miss on this front and iterate through all provided combinations without taking any cues from previous iterations.
With Bayesian optimization, we overcome this trade-off by enabling the tuning process to keep track of previous iterations and their evaluations, building a probabilistic model that maps the hyperparameters to the probability of a score on the objective function.
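To make this concrete, the sketch below shows one possible way to run such a history-aware search in Python using the hyperopt library, whose Tree-structured Parzen Estimator is one Bayesian optimization variant. The dataset, classifier, and search ranges are illustrative assumptions rather than part of the original example.

```python
# A minimal sketch of Bayesian hyperparameter tuning with hyperopt (TPE),
# assuming scikit-learn and hyperopt are installed; the dataset, model, and
# search ranges below are illustrative placeholders.
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(params):
    # Train and evaluate the model for one hyperparameter combination.
    clf = RandomForestClassifier(
        n_estimators=int(params["n_estimators"]),
        max_depth=int(params["max_depth"]),
        random_state=42,
    )
    score = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    # hyperopt minimizes the loss, so return the negative accuracy.
    return {"loss": -score, "status": STATUS_OK}

search_space = {
    "n_estimators": hp.quniform("n_estimators", 50, 500, 25),
    "max_depth": hp.quniform("max_depth", 2, 20, 1),
}

trials = Trials()  # records every evaluated combination and its score
best = fmin(
    fn=objective,
    space=search_space,
    algo=tpe.suggest,  # Tree-structured Parzen Estimator (Bayesian approach)
    max_evals=50,
    trials=trials,
)
print("Best hyperparameters found:", best)
```

Unlike grid or random search, each new combination proposed by `tpe.suggest` is conditioned on the scores stored in `trials`, so the search concentrates evaluations in regions of the space that have looked promising so far.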