In the previous recipe, we saw that Ridge Regression gives us much more stable coefficients, at the cost of a small bias (the coefficients are shrunk toward zero, so they end up smaller than they should be). Ridge is based on the L2 penalty, which is essentially the sum of the squared coefficients. To fit it, we used the glmnet package, which lets us choose how much Ridge or LASSO regularization we want.
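As a reminder of how that mixing works, here is a minimal sketch using glmnet's alpha argument (alpha = 0 gives a pure Ridge penalty, alpha = 1 a pure LASSO penalty); the use of the built-in mtcars data set and of these particular predictors is purely illustrative:

# Sketch: how alpha switches glmnet between Ridge and LASSO
library(glmnet)

x <- as.matrix(mtcars[, c("hp", "wt", "disp", "drat")])  # predictor matrix
y <- mtcars$mpg                                          # response

ridge_fit <- glmnet(x, y, alpha = 0)  # pure L2 (Ridge) penalty
lasso_fit <- glmnet(x, y, alpha = 1)  # pure L1 (LASSO) penalty

# Coefficients at one illustrative lambda value: Ridge shrinks them,
# while LASSO can set some of them exactly to zero
coef(ridge_fit, s = 0.1)
coef(lasso_fit, s = 0.1)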
Working with LASSO
Getting ready
Let's install the same packages as in the previous recipe: glmnet, ggplot2, tidyr, and MASS. They can be installed via install.packages().
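If any of them are missing on your system, the following sketch installs and then loads them:

# Install the packages used in this recipe (one-off step)
install.packages(c("glmnet", "ggplot2", "tidyr", "MASS"))

# Load them for the current session
library(glmnet)
library(ggplot2)
library(tidyr)
library(MASS)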