K-Nearest Neighbors
In our previous efforts, we built models that had coefficients or, said another way, parameter estimates for each of our included features. With KNN, we have no such parameters, as the learning method is what is known as instance-based learning. In short, "The labeled examples (inputs and corresponding output labels) are stored and no action is taken until a new input pattern demands an output value" (Battiti and Brunato, 2014, p. 11). This method is commonly called lazy learning, as no specific model parameters are produced. The training instances themselves represent the knowledge. To predict any new instance (a new data point), the training data is searched for the instances that most resemble the instance in question. KNN does this for a classification problem by looking at the closest points, the nearest neighbors, to determine the proper class. The k determines how many neighbors the algorithm should examine; so, if k=5, it will examine the five nearest points.
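To make the mechanics concrete, here is a minimal sketch of nearest-neighbor classification in plain Python. The toy data, the knn_predict helper, and the choice of Euclidean distance are illustrative assumptions for this sketch, not definitions taken from the text.

```python
# A minimal sketch of k-nearest neighbors classification.
# The data points and the knn_predict helper are hypothetical illustrations.
from collections import Counter
import math

def euclidean_distance(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, new_point, k=5):
    """Predict the class of new_point by majority vote among its k nearest training instances."""
    # Measure how much each stored training instance resembles the new instance.
    distances = [(euclidean_distance(x, new_point), label)
                 for x, label in zip(train_X, train_y)]
    # Keep only the k closest neighbors.
    nearest = sorted(distances, key=lambda d: d[0])[:k]
    # A majority vote among the neighbors' labels decides the predicted class.
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy example: two features per instance, two classes.
train_X = [(1.0, 1.1), (1.2, 0.9), (0.8, 1.0), (5.0, 5.2), (5.1, 4.9), (4.8, 5.0)]
train_y = ["A", "A", "A", "B", "B", "B"]
print(knn_predict(train_X, train_y, (1.0, 1.0), k=5))  # three "A" votes to two "B" votes -> "A"
```

Note that no model is fit ahead of time; all of the work happens at prediction, which is exactly the lazy-learning behavior described above.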