Reviewing hyperparameter tuning techniques and strategies
Hyperparameter tuning, or hyperparameter optimization, is the process ML practitioners use to find the configuration values that best solve a specific ML problem. Hyperparameters are set before training begins, such as the number of trees in a Random Forest or the learning rate of a gradient-based model; they are distinct from model parameters, such as the weights of a neural network, which are learned during training. Different problems call for tuning different hyperparameters, and selecting good values can also reveal which method is best suited to the problem. A data scientist needs to understand which hyperparameters of a chosen algorithm are tunable in order to optimize them correctly.
There are a number of ML algorithms that help solve the hyperparameter optimization problem. Let's review the most common ones.
Grid search
Grid search is the simplest algorithm and is sometimes called a brute-force approach to hyperparameter optimization: it exhaustively evaluates every combination of hyperparameter values from a predefined grid and selects the combination that yields the best score.
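The exhaustive search can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation: the `score` function below is an invented stand-in for the model evaluation you would actually run (for example, cross-validated accuracy), and the hyperparameter names and grid values are assumptions chosen for the example.

```python
from itertools import product

def score(learning_rate, n_trees):
    # Toy stand-in for model evaluation; in practice you would train
    # and validate a real model for each hyperparameter combination.
    # This synthetic objective peaks at learning_rate=0.1, n_trees=100.
    return -((learning_rate - 0.1) ** 2) - ((n_trees - 100) / 100) ** 2

# Predefined grid: every combination of these values will be tried.
grid = {
    "learning_rate": [0.01, 0.1, 1.0],
    "n_trees": [50, 100, 200],
}

best_params, best_score = None, float("-inf")
# Brute force: iterate over the Cartesian product of all grid values.
for values in product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    s = score(**params)
    if s > best_score:
        best_params, best_score = params, s

print(best_params)  # the combination with the highest score
```

Note that the number of evaluations grows multiplicatively with each hyperparameter added to the grid (here, 3 × 3 = 9 combinations), which is why grid search becomes expensive in higher dimensions.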
In Grid search...