Chapter 12. Kernel Models and SVM
In the Binomial classification section of Chapter 9, Regression and Regularization, you learned the concept of hyperplanes that segregate observations into two classes. These hyperplanes are also known as linear decision boundaries. In the case of logistic regression, the dataset must be linearly separable. This constraint is especially problematic for problems with many features that are nonlinearly dependent (high-dimensional models).
Support vector machines (SVMs) overcome this limitation by estimating the optimal separating hyperplane using kernel functions.
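The key idea behind kernel functions is that a kernel evaluated on two observations in the input space equals an inner product in some higher-dimensional feature space, without ever computing the feature mapping explicitly. As a minimal illustration (a pure-Python sketch, not taken from this chapter's code), the degree-2 polynomial kernel on 2-dimensional inputs matches the dot product of an explicit feature map:

```python
import math

def poly_kernel(x, z):
    # Degree-2 polynomial kernel: k(x, z) = (x . z)^2
    return sum(xi * zi for xi, zi in zip(x, z)) ** 2

def phi(x):
    # Explicit feature map for the 2-D case:
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

x, z = (1.0, 2.0), (3.0, 0.5)
lhs = poly_kernel(x, z)                            # kernel in input space
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))   # dot product in feature space
# The two values agree: the kernel computes the feature-space
# inner product without ever materializing phi.
```

This identity, k(x, z) = φ(x)·φ(z), is what lets an SVM fit a linear decision boundary in the feature space while working only with kernel evaluations in the input space.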
This chapter introduces kernel functions, binary support vector classifiers, one-class SVMs for anomaly detection, and support vector regression.
In this chapter, you will answer the following questions:
- What is the purpose of kernel functions?
- What is the concept behind margin maximization?
- What is the impact of some of the SVM configuration parameters and the kernel method on the...