Let's adapt the preceding example to use the Lasso. With scikit-learn, it is very easy to swap in the Lasso regressor for the least squares one that we had before:
from sklearn.linear_model import Lasso
las = Lasso(alpha=0.5)  # alpha controls the regularization strength
Now we use las where earlier we used lr; this is the only change needed. The results are exactly what we would expect: with the Lasso, the R² on the training data decreases to 0.71 (it was 0.74 before), but the cross-validated R² improves to 0.59 (as opposed to 0.56 with plain linear regression). We accept a larger error on the training data in exchange for better generalization.
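The trade-off described above can be reproduced end to end. The following is a minimal sketch: the synthetic dataset below (many features, few of them informative, with a small number of samples) stands in for the example's data, and its dimensions and the alpha value are assumptions chosen to make the overfitting of ordinary least squares visible; the exact R² numbers will differ from those quoted above.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.model_selection import cross_val_score

# Synthetic data: 60 samples, 40 features, only the first 3 carry signal.
rng = np.random.default_rng(0)
n, d = 60, 40
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [3.0, -2.0, 1.5]
y = X @ w_true + rng.normal(scale=2.0, size=n)

# Plain least squares versus the Lasso.
lr = LinearRegression().fit(X, y)
las = Lasso(alpha=0.5).fit(X, y)

# R² on the training data: least squares nearly interpolates.
train_r2_lr = lr.score(X, y)
train_r2_las = las.score(X, y)

# Cross-validated R² (the default scorer for regressors): the Lasso
# gives up training fit but generalizes much better here.
cv_lr = cross_val_score(LinearRegression(), X, y, cv=5).mean()
cv_las = cross_val_score(Lasso(alpha=0.5), X, y, cv=5).mean()

print(f"train R²: OLS {train_r2_lr:.2f}, Lasso {train_r2_las:.2f}")
print(f"CV    R²: OLS {cv_lr:.2f}, Lasso {cv_las:.2f}")
```

With many more features than the cross-validation folds can support, ordinary least squares scores highly on the training data but poorly (often negatively) under cross-validation, while the Lasso's sparsity penalty zeroes out most of the irrelevant coefficients and recovers a much better cross-validated fit.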