Regularizing with lasso regression
Lasso stands for Least Absolute Shrinkage and Selection Operator. Lasso regression is a regularization method that is conceptually very close to ridge regression. In some cases, lasso regression outperforms ridge regression, which is why it’s useful to know what it does and how to use it. In this recipe, we will briefly explain what lasso regression is and then train a model with scikit-learn on the same California housing dataset.
Getting ready
Instead of the L2-norm used by ridge regression, lasso penalizes the L1-norm of the weights, so that the loss is the following:

$$L(\omega, b) = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - (\omega \cdot x_i + b)\right)^2 + \lambda \lVert \omega \rVert_1$$
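To make the formula concrete, here is a minimal numpy sketch that evaluates this penalized loss for a given set of weights; the arrays, the bias, and the regularization strength are purely illustrative:

```python
import numpy as np

def lasso_loss(X, y, w, b, alpha):
    """Mean squared error plus an L1 penalty on the weights (illustrative sketch)."""
    residuals = y - (X @ w + b)             # prediction errors
    mse = np.mean(residuals ** 2)           # data-fit term
    l1_penalty = alpha * np.sum(np.abs(w))  # L1-norm of the weights
    return mse + l1_penalty

# Tiny illustrative example
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, 0.0])  # note the exact zero: lasso encourages such solutions
print(lasso_loss(X, y, w, b=0.1, alpha=0.1))
```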
While ridge regression tends to shrink weights toward zero smoothly, lasso is more drastic: its penalty stays steep even near zero, so it tends to set weights exactly to zero rather quickly, effectively discarding the corresponding features.
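To see this effect, the following sketch fits Ridge and Lasso with the same regularization strength on a synthetic regression problem and counts how many coefficients end up exactly at zero; the dataset and the alpha value are illustrative choices, not the ones used later in this recipe:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# Synthetic data where only a few features are truly informative
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

# Ridge shrinks weights but rarely reaches exact zero; lasso zeroes many of them
print("Ridge zero coefficients:", np.sum(ridge.coef_ == 0))
print("Lasso zero coefficients:", np.sum(lasso.coef_ == 0))
```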
Just as in the ridge regression recipe, we’ll use the same libraries and assume they are installed: numpy, sklearn, and matplotlib. Also, we’ll assume the data is already downloaded and...