The most commonly used tool for hyperparameter tuning is grid search, which is basically a fancy term for saying we will try all possible parameter combinations with a for loop.
Let's have a look at how that is done in practice.
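To see that this really is just a for loop, consider the following minimal sketch. The parameter names and grid values here are made up purely for illustration, and the model training is left as a placeholder:

from itertools import product

# A hypothetical grid with two hyperparameters; grid search
# simply tries every combination in turn.
param_grid = {'max_depth': [2, 4, 8], 'min_samples_split': [2, 5]}

for max_depth, min_samples_split in product(param_grid['max_depth'],
                                            param_grid['min_samples_split']):
    # Here you would train and score a model with this particular
    # combination, keeping track of the best-performing one.
    print(max_depth, min_samples_split)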
Returning to our k-NN classifier, we find that we have only one hyperparameter to tune: k. Typically, you would have a much larger number of hyperparameters to play with, but the k-NN algorithm is simple enough for us to implement grid search manually.
Before we get started, we need to split the dataset into training and test sets, as we have done before. Here we choose a 75-25 split:
In [1]: from sklearn.datasets import...
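The input cell above is cut off in this copy. As a hedged reconstruction, assuming the Iris dataset and scikit-learn's train_test_split and KNeighborsClassifier (the original may well load a different dataset), the full session might look like this:

In [1]: from sklearn.datasets import load_iris
   ...: from sklearn.model_selection import train_test_split
   ...: iris = load_iris()
   ...: X, y = iris.data, iris.target   # features and class labels
In [2]: # hold out 25% of the points as a test set (the 75-25 split)
   ...: X_train, X_test, y_train, y_test = train_test_split(
   ...:     X, y, test_size=0.25, random_state=37)
In [3]: from sklearn.neighbors import KNeighborsClassifier
   ...: best_acc, best_k = 0.0, None
   ...: for k in range(1, 20):          # the grid: every candidate value of k
   ...:     knn = KNeighborsClassifier(n_neighbors=k)
   ...:     knn.fit(X_train, y_train)
   ...:     acc = knn.score(X_test, y_test)   # accuracy on the held-out set
   ...:     if acc > best_acc:
   ...:         best_acc, best_k = acc, k
In [4]: best_acc, best_k

One caveat worth keeping in mind: scoring every candidate k on the same test set quietly reuses the test data for model selection, so the reported accuracy is an optimistic estimate.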