Summary
This chapter demonstrated how straightforward it is to use KNN for binary or multiclass classification. Because KNN makes no assumptions about normality or linearity, it can be used in cases where logistic regression may not yield the best results. That flexibility brings with it a real risk of overfitting, so care must be taken with the choice of k. We also explored how to tune the hyperparameters of both binary and multiclass models. Finally, KNN is not a great option when we care about prediction speed or when we are working with a large dataset. Decision tree or random forest classification, which we explored in the previous chapter, is often a better choice in those cases.
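As a reminder of the tuning workflow, the choice of k can be made with a cross-validated grid search. The sketch below is a minimal illustration, assuming scikit-learn's `KNeighborsClassifier` and `GridSearchCV` and the built-in iris dataset rather than this chapter's own data:

```python
# Minimal sketch: tuning k for KNN with cross-validation.
# Assumes scikit-learn; the dataset here (iris) is illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Search over a range of k values with 5-fold cross-validation.
grid = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": range(1, 21)},
    cv=5,
)
grid.fit(X, y)

print(grid.best_params_)   # the k that scored best on held-out folds
print(grid.best_score_)    # mean cross-validated accuracy for that k
```

Too small a k overfits to local noise, while too large a k washes out class boundaries; the grid search balances the two empirically.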
Another strong option is support vector classification, which we will explore in the next chapter.