Linear regression
Although simple, linear regression deserves a prominent place in your machine learning toolbox. The term regression is usually associated with fitting a model to data by minimizing the error between the expected and predicted values, computed as the sum of squared errors, also known as the residual sum of squares or the least squares error.
Least squares problems fall into two broad categories:
Ordinary least squares
Non-linear least squares
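Ordinary least squares is the category that applies to linear regression: because the model is linear in its weights, the optimal weights have a closed-form solution. The following is a minimal sketch (not the book's implementation) using NumPy's least squares solver on synthetic data with a made-up slope of 2.0 and intercept of 1.0:

```python
import numpy as np

# Synthetic data: y = 2.0 * x + 1.0 plus Gaussian noise (illustrative values).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Design matrix with a column of ones so the intercept w0 is learned too.
X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares: solve for the weights minimizing ||X w - y||^2.
w, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
w0, w1 = w  # intercept and slope, close to the true 1.0 and 2.0
print(w0, w1)
```

Non-linear least squares, by contrast, arises when the model is not linear in its weights (for example, an exponential decay) and requires an iterative optimizer rather than a direct solve.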
Univariate linear regression
Let's start with the simplest form of linear regression, single-variable regression, in order to introduce the terms and concepts behind linear regression. In its simplest interpretation, univariate linear regression consists of fitting a line to a set of data points {(x, y)}.
Note
M1: This is a single variable linear regression for a model f, with weights wj for features xj, and labels (or expected values) yj:
![](https://static.packt-cdn.com/products/9781787122383/graphics/B06240_09_01.jpg)
Here, w1 is the slope, w0 is the intercept, f is the linear function that minimizes the...
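The slope w1 and intercept w0 above can also be computed directly from the standard closed-form formulas, w1 = cov(x, y) / var(x) and w0 = mean(y) − w1 · mean(x). Here is a minimal sketch on made-up data points (the values are illustrative, not from the book):

```python
import numpy as np

# Illustrative data points {(x_j, y_j)} roughly following a line.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

x_mean, y_mean = x.mean(), y.mean()

# Slope: covariance of x and y divided by variance of x.
w1 = ((x - x_mean) * (y - y_mean)).sum() / ((x - x_mean) ** 2).sum()
# Intercept: the fitted line passes through the point of means.
w0 = y_mean - w1 * x_mean

# The quantity f minimizes: the sum of squared errors (residual sum of squares).
sse = ((y - (w0 + w1 * x)) ** 2).sum()
print(w0, w1, sse)
```

Any other choice of w0 and w1 would produce a larger `sse` on this data, which is exactly what it means for f to minimize the sum of squared errors.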