Support Vector Machines
Support Vector Machines (SVMs) are a family of supervised learning techniques for classification and regression (and also for outlier detection). They are quite versatile, as they can fit both linear and nonlinear models thanks to special functions called kernel functions. The specialty of a kernel function is that it implicitly maps the input features into a new, higher-dimensional feature space while requiring only a limited amount of computation, since it operates on the original features rather than on the transformed ones. By nonlinearly recombining the original features, kernels make it possible to model the response with very complex functions. In this sense, SVMs are comparable to neural networks as universal approximators, and thus can boast a similar predictive power on many problems.
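As a quick illustration of this versatility, here is a minimal sketch (assuming scikit-learn is available; the toy dataset and parameter values are illustrative choices, not prescriptions) that fits both a linear and an RBF-kernel SVM on data arranged in concentric circles, a case where only the kernelized model can separate the two classes well.

# A linear vs. an RBF-kernel SVM on a nonlinearly separable toy dataset
# (illustrative sketch; dataset and hyperparameters are arbitrary choices)
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Concentric circles: no straight line can separate the two classes
X, y = make_circles(n_samples=500, noise=0.1, factor=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, C=1.0).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))

On such data, the linear kernel typically scores near chance level, while the RBF kernel separates the classes almost perfectly, which is the practical effect of the implicit nonlinear feature mapping described above.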
Unlike the linear models seen in the previous chapter, SVMs originated as a method for solving classification problems, not regression ones.
SVMs were invented at the AT&T Laboratories in the '90s by the mathematician Vladimir Vapnik and computer...