Hands-On Data Analysis with Pandas

You're reading from Hands-On Data Analysis with Pandas: Efficiently perform data collection, wrangling, analysis, and visualization using Python

Product type: Paperback
Published: Jul 2019
ISBN-13: 9781789615326
Length: 740 pages
Edition: 1st Edition
Author: Stefanie Molin
Table of Contents (21 chapters)

Preface
1. Section 1: Getting Started with Pandas
2. Introduction to Data Analysis
3. Working with Pandas DataFrames
4. Section 2: Using Pandas for Data Analysis
5. Data Wrangling with Pandas
6. Aggregating Pandas DataFrames
7. Visualizing Data with Pandas and Matplotlib
8. Plotting with Seaborn and Customization Techniques
9. Section 3: Applications - Real-World Analyses Using Pandas
10. Financial Analysis - Bitcoin and the Stock Market
11. Rule-Based Anomaly Detection
12. Section 4: Introduction to Machine Learning with Scikit-Learn
13. Getting Started with Machine Learning in Python
14. Making Better Predictions - Optimizing Models
15. Machine Learning Anomaly Detection
16. Section 5: Additional Resources
17. The Road Ahead
18. Solutions
19. Other Books You May Enjoy
Appendix

Regularization

When working with regressions, we may add a penalty term to the cost function to reduce overfitting by penalizing certain choices of coefficients; this is called regularization. We look for the coefficients that minimize the cost function, penalty term included. The idea is to shrink the coefficients toward zero for features that don't contribute much to reducing the error of the model. Some common techniques are ridge regression, LASSO (short for Least Absolute Shrinkage and Selection Operator) regression, and elastic net regression, which combines the LASSO and ridge penalty terms.
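As a minimal sketch of the shrinkage effect, the following compares coefficients from ordinary least squares against the three regularized variants using scikit-learn (which this book's later chapters use). The toy data, `alpha` values, and `l1_ratio` are invented for illustration; `alpha` plays the role of the penalty strength (the lambda in the math):

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso, LinearRegression, Ridge

# Toy data: y depends strongly on the first feature, weakly on the second,
# and not at all on the remaining three.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.5, size=100)

# alpha controls the penalty strength; l1_ratio blends LASSO and ridge.
models = {
    "ols": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.1),
    "elastic_net": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X, y)
    print(name, np.round(model.coef_, 3))
```

The regularized models report coefficients pulled toward zero relative to OLS, with LASSO typically zeroing out the uninformative features entirely.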

Ridge regression, also called L2 regularization, punishes high coefficients ($\beta_j$) by adding the sum of the squares of the coefficients to the cost function (which the regression seeks to minimize when fitting), as per the following penalty term:

$$\lambda \sum_{j=1}^{m} \beta_j^2$$
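To make the penalty concrete, here is a small numeric sketch; the coefficient vector and the lambda value are made up for illustration:

```python
import numpy as np

# Hypothetical fitted coefficients and regularization strength (lambda).
beta = np.array([2.0, -0.5, 0.0, 1.5])
lam = 0.1

# L2 (ridge) penalty: lambda times the sum of squared coefficients.
l2_penalty = lam * np.sum(beta ** 2)
print(l2_penalty)  # 0.1 * (4 + 0.25 + 0 + 2.25) = 0.65
```

Squaring means large coefficients dominate the penalty, so ridge shrinks them hardest, while a coefficient already at zero contributes nothing.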
This penalty term...
