Stochastic gradient descent - adult income
Stochastic gradient descent, also known as incremental gradient descent, is a stochastic approximation of the gradient descent optimization method for minimizing an objective function that is written as a sum of differentiable functions, Q(w) = Σ_i Q_i(w), where each summand Q_i is the loss on the i-th training example. It searches for a minimum (or, for maximization problems, a maximum) by iteration. In stochastic gradient descent, the true gradient of Q(w) is approximated by the gradient at a single example:

w := w − η ∇Q_i(w)

where η is the learning rate.
As the algorithm sweeps through the training set, it performs the above update for each training example. Several passes can be made over the training set until the algorithm converges. When multiple passes are made, the data can be shuffled before each pass to prevent cycles. Typical implementations also use an adaptive learning rate so that the algorithm converges.
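The update loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the recipe's own code: it assumes a least-squares per-example loss Q_i(w) = ½(x_i·w − y_i)², and the toy data and all parameter values (learning rate, number of passes) are invented for the example.

```python
import numpy as np

def sgd(X, y, lr=0.01, epochs=50, seed=0):
    """Minimal stochastic gradient descent for least-squares regression.

    Per-example loss (an assumption for this sketch):
        Q_i(w) = 0.5 * (x_i . w - y_i)**2
    whose gradient is (x_i . w - y_i) * x_i.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):                  # several passes over the data
        order = rng.permutation(len(X))      # shuffle each pass to prevent cycles
        for i in order:
            grad = (X[i] @ w - y[i]) * X[i]  # gradient at a single example
            w -= lr * grad                   # w := w - eta * grad Q_i(w)
    return w

# Toy usage on synthetic, noiseless data with true weights [2, -3]:
X = np.random.default_rng(1).normal(size=(200, 2))
y = X @ np.array([2.0, -3.0])
w = sgd(X, y)
```

On this noiseless toy problem, `w` converges close to the true weights `[2, -3]`; on real data, the fixed learning rate here would typically be replaced by a decaying or adaptive schedule, as noted above.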
Getting ready
To perform stochastic gradient descent, we will use a dataset collected from census data to predict income.
Step 1 - collecting and describing the data
The dataset titled adult.txt will be...