Chapter 3: Bagging with Random Forests
In this chapter, you will gain proficiency in building random forests, a leading competitor to XGBoost. Like XGBoost, random forests are ensembles of decision trees; the difference is that random forests combine trees via bagging, while XGBoost combines them via boosting. Random forests are a viable alternative to XGBoost, with advantages and limitations that are highlighted in this chapter. Learning about random forests matters for two reasons: they provide valuable insight into the structure of tree-based ensembles in general, and contrasting bagging with boosting deepens your understanding of how XGBoost builds its ensembles.
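To make the bagging-versus-boosting distinction concrete, here is a minimal sketch that trains both kinds of ensemble side by side. The dataset (load_breast_cancer) and the hyperparameter values are illustrative choices, not taken from the chapter:

```python
# Minimal sketch contrasting a bagging ensemble with a boosting ensemble.
# Assumes scikit-learn and xgboost are installed; dataset is illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# Bagging: each tree is trained independently on a bootstrap sample of
# the data, and the forest aggregates their votes.
rf = RandomForestClassifier(n_estimators=100, random_state=2)

# Boosting: trees are trained sequentially, each new tree correcting
# the errors of the ensemble built so far.
xgb = XGBClassifier(n_estimators=100, random_state=2)

print("Random forest CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())
print("XGBoost CV accuracy:     ", cross_val_score(xgb, X, y, cv=5).mean())
```

The key structural difference shows up in training, not in the API: both models expose the same fit/predict interface, but the forest's trees could be built in parallel, while the booster's trees cannot.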
In this chapter, you will build and evaluate random forest classifiers and random forest regressors, gain mastery of random forest hyperparameters, learn about bagging in the machine learning landscape, and explore a case study that highlights some random forest limitations that spurred the development of gradient boosting. A brief preview of the regressor workflow follows.
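As a preview of the kind of model you will build, the sketch below fits a random forest regressor and touches two common hyperparameters, n_estimators and max_depth. The diabetes dataset and the specific values are stand-ins, assumed for illustration:

```python
# Preview sketch: a random forest regressor scored with cross-validation.
# Dataset and hyperparameter values are illustrative assumptions.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

# n_estimators sets the number of trees; max_depth limits each tree's depth.
reg = RandomForestRegressor(n_estimators=100, max_depth=6, random_state=2)

scores = cross_val_score(reg, X, y, cv=5, scoring="neg_root_mean_squared_error")
print("RMSE per fold:", -scores)  # negate: scikit-learn returns negative errors
```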