Summary
In this chapter, we discussed several important ensemble learning algorithms: bootstrapping for generating additional training sets by resampling, bagging for aggregating the predictions of weak estimators trained on those sets, boosting for improving accuracy by combining weak learners sequentially without letting variance grow too much, and the random forest algorithm.
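As a quick recap in code, the sketch below fits each of these ensembles with scikit-learn on a synthetic dataset; the dataset, hyperparameters, and model choices here are illustrative placeholders rather than the chapter's own examples.

```python
# A minimal sketch of the ensemble methods recapped above, using scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    BaggingClassifier,
    GradientBoostingClassifier,
    RandomForestClassifier,
)
from sklearn.model_selection import cross_val_score

# Synthetic data stands in for a real dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

models = {
    # Bagging: each tree is trained on a bootstrap sample of the data,
    # and predictions are aggregated by majority vote
    # (the default base estimator is a decision tree).
    "bagging": BaggingClassifier(n_estimators=100, random_state=42),
    # Boosting: trees are fitted sequentially, each one correcting
    # the errors of its predecessors.
    "boosting": GradientBoostingClassifier(n_estimators=100, random_state=42),
    # Random forest: bagging plus a random subset of features at each split.
    "random forest": RandomForestClassifier(n_estimators=100, random_state=42),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```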
Ensemble algorithms are powerful because they build stronger models on top of basic ones, and understanding them will benefit your data science career in the long run.
In the next chapter, we will examine some common mistakes and go over some best practices in data science.