Stochastic gradient descent (SGD) is a fundamental technique for fitting regression models, and there are natural connections between SGD for regression and SGD for classification.
Using SGD for classification
Getting ready
In regression, we minimized a cost function that penalized bad predictions on a continuous scale; in classification, we'll minimize a cost function that penalizes predictions over two (or more) discrete classes.
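To make the contrast concrete, here is a minimal sketch (using NumPy, with hand-picked example values) comparing a continuous regression penalty, the squared error, with a common classification penalty, the log loss, which punishes confident predictions of the wrong class:

```python
import numpy as np

# Regression: squared error penalizes on a continuous scale.
y_true, y_pred = 3.0, 2.5
squared_loss = (y_true - y_pred) ** 2  # small miss -> small penalty

# Classification (binary): log loss penalizes the predicted
# probability assigned to the wrong class.
y, p = 1, 0.9                # true label 1, predicted P(y=1) = 0.9
log_loss = -(y * np.log(p) + (1 - y) * np.log(1 - p))

# The same formula with p = 0.1 (a confident wrong answer)
# yields a much larger penalty than with p = 0.9.
bad_log_loss = -np.log(0.1)
```

The values and variable names here are illustrative; scikit-learn's SGD classes select such loss functions internally via their `loss` parameter.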
How to do it...
- First, let's create some very basic data:
from sklearn import datasets
X, y = datasets.make_classification(n_samples=500)
- Split the...