In this section, we will discuss ridge regression, an alternative regression technique to OLS. We will look at the basic idea behind ridge regression, the regularization hyperparameter it introduces, and how to use it in practice.
Ridge regression constrains a linear model as it is being fit by penalizing large coefficients (an L2 penalty on the weights). It is another technique for controlling overfitting in linear models, something plain OLS does not attempt. The strength of the regularization is controlled by a parameter, α: a larger α means stronger regularization and less tolerance of overfitting, while α = 0 recovers OLS. In some ways, ridge regression and Bayesian ridge regression are equivalent; they mainly differ in their presentation.
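Although the objective is not written out in this excerpt, ridge regression is commonly stated as minimizing the following quantity, where X is the feature matrix, y the target vector, w the coefficient vector, and α the regularization strength discussed above:

$$
\min_{w} \ \lVert Xw - y \rVert_2^2 + \alpha \lVert w \rVert_2^2
$$

We will now perform ridge regression using the following steps: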
- We will import the required functions, as follows:
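The import list itself is not shown in this excerpt; a minimal sketch of what the following steps need might look like this (assuming pandas for the data, scikit-learn's `Ridge` estimator, `load_boston` for the dataset, and `train_test_split` for a held-out evaluation — the book's exact imports may differ):

```python
import pandas as pd

from sklearn.datasets import load_boston  # deprecated in 1.0, removed in scikit-learn 1.2
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
```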
- Now, we will load in our Boston dataset; we're going to drop the...
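The step above is cut off in this excerpt, so it does not say which column is dropped. As a rough sketch of the pattern it describes (assuming a scikit-learn version older than 1.2, where `load_boston` is still available, and building on the imports from the previous step), loading the data and fitting a ridge model could look like this:

```python
# Sketch only: continues from the imports in the previous step.
boston = load_boston()
X = pd.DataFrame(boston.data, columns=boston.feature_names)
y = boston.target

# The original step drops a column here, but the excerpt does not say which one,
# e.g. X = X.drop(columns=["SOME_COLUMN"])  # placeholder, not from the book

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# alpha is the regularization strength discussed above: larger values shrink
# the coefficients more aggressively, and alpha = 0 corresponds to plain OLS.
ridge = Ridge(alpha=1.0)
ridge.fit(X_train, y_train)
print(ridge.score(X_test, y_test))  # R^2 on the held-out data
```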