Gradient descent variants
This section walks through how the gradient descent algorithm optimizes a simple linear regression model (y = mx + c), illustrated with Python code.
Application of gradient descent
Keeping the number of iterations fixed, the algorithm is run with three different learning rates, producing three models and hence three MSE (mean squared error) values. MSE is the loss (cost) function used in linear regression: MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)², where ŷᵢ = m·xᵢ + c is the model's prediction:
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error

# gradient descent method
class GDLinearRegression:
    def __init__(self, learning_rate, epoch):
        # epoch is the number of iterations
        self.learning_rate, self.iterations = learning_rate, epoch

    def fit(self, X, y):
        c = 0  # intercept, initialized to zero
        m = 0  # slope, initialized to zero
        n = X.shape[0]
        for _ in range(self.iterations):
            y_pred = m * X + c
            # partial derivatives of MSE with respect to m and c
            d_m = (-2 / n) * np.sum(X * (y - y_pred))
            d_c = (-2 / n) * np.sum(y - y_pred)
            m -= self.learning_rate * d_m
            c -= self.learning_rate * d_c
        self.m, self.c = m, c
        return self
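A minimal, self-contained sketch of the experiment described above: the same descent loop is run at three learning rates with the iteration count held fixed, and the resulting MSE values are compared. The synthetic data (a noisy line y = 2x + 1), the learning rates, and the iteration count are illustrative assumptions, not values from the original text.

```python
import numpy as np

# synthetic data for an assumed true line y = 2x + 1 with small Gaussian noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 100)
y = 2 * X + 1 + rng.normal(0, 0.1, 100)

def gradient_descent(X, y, learning_rate, epochs):
    """Plain gradient descent on MSE for y = m*x + c."""
    m, c, n = 0.0, 0.0, X.shape[0]
    for _ in range(epochs):
        y_pred = m * X + c
        # gradients of MSE with respect to m and c
        m -= learning_rate * (-2 / n) * np.sum(X * (y - y_pred))
        c -= learning_rate * (-2 / n) * np.sum(y - y_pred)
    return m, c

# three learning rates, iterations fixed: three models, three MSE values
for lr in (0.001, 0.01, 0.1):
    m, c = gradient_descent(X, y, lr, epochs=1000)
    mse = np.mean((y - (m * X + c)) ** 2)
    print(f"lr={lr}: m={m:.3f}, c={c:.3f}, MSE={mse:.4f}")
```

With too small a learning rate the loop makes little progress in the fixed iteration budget, so its MSE stays high; the larger (but still stable) rate converges close to the true line.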