The third regression algorithm that we want to explore is support vector regression (SVR). As the name implies, SVR is part of the support vector family and a sibling of the support vector machine for classification (or we can just call it SVC) that we learned about in Chapter 5, Classifying Newsgroup Topic with Support Vector Machine.
To recap, SVC seeks an optimal hyperplane that best segregates observations from different classes. Suppose a hyperplane is determined by a slope vector w and an intercept b; the optimal hyperplane is picked so that the distance (which can be expressed as 1/‖w‖) from its nearest points in each of the segregated spaces to the hyperplane itself is maximized. Such optimal w and b can be learned and solved by the following optimization problem:
- Minimizing ‖w‖
- Subject to wx(i)+b≥1 if y(i)=1, and wx(i)+b≤-1 if y(i)=-1
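The margin distance 1/‖w‖ in the recap above can be verified numerically. The following sketch uses a hypothetical 2D hyperplane (the values of w and b are made up for illustration) and computes the distance from a point on the margin boundary wx+b=1 to the hyperplane wx+b=0, confirming it equals 1/‖w‖:

```python
import numpy as np

# Hypothetical hyperplane w.x + b = 0 in 2D, chosen so that ||w|| = 5
w = np.array([3.0, 4.0])  # slope vector
b = -5.0                  # intercept

def distance_to_hyperplane(x, w, b):
    """Signed distance from point x to the hyperplane w.x + b = 0."""
    return (np.dot(w, x) + b) / np.linalg.norm(w)

# A point lying on the margin boundary w.x + b = 1:
# scale the unit normal w/||w|| so that w.x + b evaluates to exactly 1
x_on_margin = (w / np.linalg.norm(w)) * (6.0 / 5.0)

# Its distance to the hyperplane equals 1/||w|| = 1/5 = 0.2
print(distance_to_hyperplane(x_on_margin, w, b))  # 0.2
```

Maximizing this margin is equivalent to minimizing ‖w‖, which is why the optimization problem above takes that form.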