The most commonly used tool for hyperparameter tuning is grid search, which is basically a fancy term for saying we will try all possible parameter combinations with a for loop.
Let's have a look at how that is done in practice.
Returning to our kNN classifier, we find that we have only one hyperparameter to tune: k. Typically, you would have a much larger number of hyperparameters to play with, but the kNN algorithm is simple enough for us to implement a grid search manually.
Before we get started, we need to split the dataset as we have done before into training and test sets:
In [1]: from sklearn.datasets import...
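A minimal sketch of the whole procedure might look like the following. Here I assume the Iris dataset and scikit-learn's `KNeighborsClassifier`; the specific dataset, the range of k values, and the `random_state` seed are illustrative choices, not fixed by the text:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load an example dataset (Iris is assumed here for illustration)
X, y = load_iris(return_X_y=True)

# Split into training and test sets, as we have done before
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=37
)

# The "grid" here is one-dimensional: a for loop over candidate k values
best_k, best_acc = 0, 0.0
for k in range(1, 20):
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    acc = knn.score(X_test, y_test)  # accuracy on the test set
    if acc > best_acc:
        best_k, best_acc = k, acc

print(best_k, best_acc)
```

This is grid search in its simplest form: try every candidate value, keep the one that scores best. With more than one hyperparameter, the single loop becomes nested loops over all parameter combinations.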