Section 1: Bagging and Boosting
This section opens the book by preprocessing data with pandas, building standard regression and classification models, and then training a first XGBoost model with scikit-learn defaults. From there, it explores the practical theory behind XGBoost by advancing through decision trees (XGBoost's base learners), random forests (bagging), and gradient boosting, comparing scores along the way and fine-tuning ensemble and tree-based hyperparameters.
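As a taste of the comparison this section builds toward, here is a minimal sketch (not the book's own code) that scores that progression of models with cross-validation, using scikit-learn's breast cancer dataset as a stand-in for the book's preprocessed data:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from xgboost import XGBClassifier

# Toy dataset standing in for the book's preprocessed data
X, y = load_breast_cancer(return_X_y=True)

# The progression covered in this section: a single decision tree,
# bagged trees (random forest), gradient boosting, and XGBoost,
# each left at its default hyperparameters
models = {
    'Decision tree': DecisionTreeClassifier(random_state=2),
    'Random forest (bagging)': RandomForestClassifier(random_state=2),
    'Gradient boosting': GradientBoostingClassifier(random_state=2),
    'XGBoost': XGBClassifier(random_state=2),
}

# Compare mean accuracy across 5-fold cross-validation
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f'{name}: {scores.mean():.3f}')
```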
This section comprises the following chapters:
Chapter 1, Machine Learning Landscape
Chapter 2, Decision Trees in Depth
Chapter 3, Bagging with Random Forests
Chapter 4, From Gradient Boosting to XGBoost