Summary
In this chapter, you greatly extended your range with XGBoost by applying all of its base learners, including gbtree, dart, gblinear, and random forests, to regression and classification datasets. You previewed, applied, and tuned the hyperparameters unique to each base learner to improve scores. Furthermore, you experimented with gblinear on a linearly constructed dataset, and with XGBRFRegressor and XGBRFClassifier to build XGBoost random forests without any boosting whatsoever. Now that you have worked with all of the base learners, your understanding of XGBoost's range is at an advanced level.
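As a brief recap, the sketch below shows the core pattern from the chapter: switching base learners through the booster hyperparameter and building an unboosted random forest with XGBRFRegressor. The make_regression dataset and the scoring metric here are illustrative assumptions, not the chapter's exact data.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor, XGBRFRegressor

# Illustrative stand-in for the chapter's linearly constructed dataset.
X, y = make_regression(n_samples=500, n_features=20, noise=0.1, random_state=2)

# gbtree is the default base learner; dart and gblinear are selected
# via the booster hyperparameter.
for booster in ['gbtree', 'dart', 'gblinear']:
    model = XGBRegressor(booster=booster, random_state=2)
    scores = cross_val_score(model, X, y,
                             scoring='neg_root_mean_squared_error', cv=5)
    print(f'{booster}: RMSE = {-scores.mean():.3f}')

# XGBRFRegressor bags trees like a random forest, with no boosting rounds.
rf = XGBRFRegressor(n_estimators=100, random_state=2)
scores = cross_val_score(rf, X, y,
                         scoring='neg_root_mean_squared_error', cv=5)
print(f'random forest: RMSE = {-scores.mean():.3f}')
```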
In the next chapter, you will analyze tips and tricks from Kaggle masters to advance your XGBoost skills even further!