Data-Centric Machine Learning with Python
The ultimate guide to engineering and deploying high-quality models based on good data

Product type: Paperback
Published: Feb 2024
Publisher: Packt
ISBN-13: 9781804618127
Length: 378 pages
Edition: 1st
Authors (3): Jonas Christensen, Manmohan Gosada, Nakul Bajaj
Table of Contents (17)

Preface
Part 1: What Data-Centric Machine Learning Is and Why We Need It
  Chapter 1: Exploring Data-Centric Machine Learning
  Chapter 2: From Model-Centric to Data-Centric – ML’s Evolution
Part 2: The Building Blocks of Data-Centric ML
  Chapter 3: Principles of Data-Centric ML
  Chapter 4: Data Labeling Is a Collaborative Process
Part 3: Technical Approaches to Better Data
  Chapter 5: Techniques for Data Cleaning
  Chapter 6: Techniques for Programmatic Labeling in Machine Learning
  Chapter 7: Using Synthetic Data in Data-Centric Machine Learning
  Chapter 8: Techniques for Identifying and Removing Bias
  Chapter 9: Dealing with Edge Cases and Rare Events in Machine Learning
Part 4: Getting Started with Data-Centric ML
  Chapter 10: Kick-Starting Your Journey in Data-Centric Machine Learning
Index
Other Books You May Enjoy

Ensemble techniques

Ensemble techniques are powerful methods for improving the performance of machine learning models, particularly in scenarios involving imbalanced datasets, rare events, and edge cases. These techniques combine multiple base models to produce a final prediction that is more robust and accurate than any single model's. Let's discuss some popular ensemble techniques.
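As a minimal sketch of the combining idea, here is a soft-voting ensemble of two base models on a synthetic imbalanced dataset. The choice of scikit-learn's VotingClassifier, the base models, and the 90/10 class weights are illustrative assumptions, not the book's own example:

```python
# Illustrative sketch (assumed setup, not the book's code): combine two base
# models by averaging their predicted probabilities (soft voting).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced dataset: roughly 90% negatives, 10% positives
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
    ],
    voting="soft",  # average class probabilities across base models
)
ensemble.fit(X_train, y_train)
print(f"ensemble accuracy: {ensemble.score(X_test, y_test):.2f}")
```

With `voting="hard"` the ensemble would instead take a majority vote over predicted labels; soft voting requires every base model to expose `predict_proba`.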

Bagging

Bootstrap aggregating (bagging) is an ensemble technique that creates multiple bootstrap samples (random subsets with replacement) from the training data and trains a separate base model on each sample. The final prediction is obtained by averaging or voting the predictions of all base models. Bagging is particularly useful when dealing with high variance and complex models, as it reduces overfitting and enhances the model’s generalization ability. Here are the key concepts associated with bagging:

  • Bootstrap sampling: The bagging process begins by creating multiple random subsets of the training data through...
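The bootstrap-sampling idea can be sketched from scratch: draw random subsets with replacement, train one base model per subset, and aggregate by majority vote. The base model, dataset, and ensemble size below are illustrative assumptions, not the book's own example:

```python
# Illustrative from-scratch bagging sketch (assumed setup, not the book's code).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_models = 25
models = []
for _ in range(n_models):
    # Bootstrap sample: draw len(X_train) row indices with replacement
    idx = rng.integers(0, len(X_train), size=len(X_train))
    models.append(DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx]))

# Aggregate: majority vote across all base models
votes = np.stack([m.predict(X_test) for m in models])  # shape (n_models, n_test)
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print(f"bagged accuracy: {np.mean(y_pred == y_test):.2f}")
```

In practice, scikit-learn's BaggingClassifier (or RandomForestClassifier, which adds per-split feature subsampling) packages the same loop; the deep, unpruned trees used here are the high-variance base models that bagging is designed to stabilize.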