Summary
This chapter introduced Optuna as a framework for HPO. We discussed the challenges of finding optimal hyperparameters and how HPO algorithms can search the parameter space efficiently.
We discussed two optimization algorithms available in Optuna: TPE and CMA-ES. Both allow the user to set a specific optimization budget (the number of trials to perform) and then search for suitable parameters within that constraint. Further, we discussed pruning unpromising trials to save additional resources and time, covering median pruning as well as the more complex but effective techniques of successive halving and Hyperband.
We then showed how to perform HPO studies for LightGBM in a practical example. We also demonstrated advanced features of Optuna for saving and resuming studies, understanding the effects of individual parameters, and performing MOO.
The next chapter focuses on two case studies using LightGBM, where the data science...