Optuna and optimization algorithms
Examples in previous chapters have shown that choosing good hyperparameters is critical to solving a machine learning problem. Hyperparameters significantly affect an algorithm’s performance and generalization capability, and the optimal values are specific both to the model used and to the learning problem being solved.
Other issues complicating hyperparameter optimization are as follows:
- Cost: For each unique set of hyperparameters (of which there can be many), an entire training run, often with cross-validation, must be performed. This is highly time-consuming and computationally expensive.
- High-dimensional search spaces: Each parameter can take values from a vast range, making it infeasible to test every possible combination.
- Parameter interaction: Parameters often cannot be optimized in isolation, because the best value of one parameter can depend on the values of others. A good example is the learning...