Before ending this chapter, it is useful to highlight some additional linear classification algorithms:
- SGD is a versatile solver. As mentioned earlier, depending on the loss function used, it can perform logistic regression, SVM, or perceptron classification. It also supports regularization penalties (see the sketch after this list).
- RidgeClassifier converts the class labels into +1 and -1 and treats the problem as a regression task. It also handles multiclass problems well. Because of this design it relies on a different set of solvers (those of ridge regression), so it is worth trying: it can be faster to train when the number of classes is large.
- Linear Support Vector Classification (LinearSVC) is another linear model. Instead of the log loss it uses the hinge loss, which tries to place the class boundaries so that the samples of each class lie as far as possible from them. This is not to be confused with...
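As a minimal sketch, assuming scikit-learn is available, the snippet below fits the three classifiers just discussed on a synthetic dataset and compares their test accuracy. The dataset and hyperparameter values are illustrative choices, not recommendations, and the loss name for logistic regression is spelled 'log_loss' in recent scikit-learn releases ('log' in older ones).

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier, RidgeClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Illustrative multiclass dataset (assumed parameters, not a recommendation)
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # SGD with the log loss behaves like logistic regression; switching the
    # loss to 'hinge' or 'perceptron' changes the model family accordingly.
    'SGD (log loss)': SGDClassifier(loss='log_loss', penalty='l2',
                                    random_state=0),
    # RidgeClassifier recasts classification as regression on {-1, +1}
    # targets and reuses the ridge regression solvers.
    'RidgeClassifier': RidgeClassifier(),
    # LinearSVC fits a linear SVM using a hinge-type loss.
    'LinearSVC': LinearSVC(),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f'{name}: test accuracy = {model.score(X_test, y_test):.3f}')
```

All three estimators expose the same fit/score interface, so swapping one for another in a pipeline is straightforward; the differences lie in the loss being optimized and the solvers used to minimize it.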