Classification with support vector machines
Support vector machines (SVMs) can be used for both regression (support vector regression, SVR) and classification (support vector classification, SVC). The algorithm was invented by Vladimir Vapnik in 1993 (see http://en.wikipedia.org/wiki/Support_vector_machine). SVM maps data points to points in a multidimensional space. The mapping is performed by a so-called kernel function, which can be linear or nonlinear. The classification problem is then reduced to finding a hyperplane, or hyperplanes, that best separate the points into classes. It can be hard to perform the separation with hyperplanes alone, which led to the concept of the soft margin. The soft margin measures the tolerance for misclassification and is governed by a constant commonly denoted by C. Another important parameter is the type of kernel function, which can be one of the following:
- A linear function
- A polynomial function
- A radial basis function
- A sigmoid function...
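
The ideas above can be sketched with scikit-learn's `SVC` class, assuming scikit-learn is installed; the dataset (Iris), the `C` value, and the choice of the radial basis function kernel are illustrative, not prescribed by the text:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load a small labeled dataset and hold out a test set.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# kernel selects the mapping function; C sets the soft-margin
# tolerance for misclassification (larger C = less tolerance).
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
print("Test accuracy: %.2f" % clf.score(X_test, y_test))
```

Swapping `kernel="rbf"` for `"linear"`, `"poly"`, or `"sigmoid"` selects the other kernel types listed above, and tuning `C` trades margin width against training errors.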