Finding the separating boundary with SVM
SVM is another powerful classifier, particularly effective in high-dimensional spaces and in cases where the number of dimensions exceeds the number of samples.
In machine learning classification, SVM finds an optimal hyperplane that best segregates observations from different classes.
A hyperplane is an (n - 1)-dimensional subspace that separates the n-dimensional feature space of the observations into two half-spaces. For example, in a two-dimensional feature space the hyperplane is a line, and in a three-dimensional feature space it is a surface. The optimal hyperplane is the one that maximizes the distance from itself to the nearest points on each side; these nearest points are the so-called support vectors.
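As a quick illustration, the sketch below fits a linear SVM on a handful of made-up two-dimensional points (the data and parameter choices are my own, not from a specific dataset) and prints the learned hyperplane coefficients and the support vectors. A large C is used to approximate a hard margin:

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2D binary classification data (hypothetical points)
X = np.array([[1, 1], [2, 2], [2, 0],
              [0, 0], [1, 0], [0, 1]])
y = np.array([1, 1, 1, -1, -1, -1])

# A linear SVM finds the maximum-margin hyperplane w . x + b = 0;
# a very large C approximates a hard margin (little slack allowed)
clf = SVC(kernel='linear', C=1e6)
clf.fit(X, y)

print('Weights w:', clf.coef_[0])          # normal vector of the hyperplane
print('Bias b:', clf.intercept_[0])        # offset of the hyperplane
print('Support vectors:\n', clf.support_vectors_)
```

Only the points closest to the boundary end up as support vectors; the other observations could be removed without changing the fitted hyperplane.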
The following toy example shows what the support vectors and the separating hyperplane (along with the distance margin, which I will explain later) look like in a binary classification case:
Figure...