An overview of XGBoost
XGBoost, short for eXtreme Gradient Boosting, is a widely used open-source gradient boosting library with similar goals and functionality to LightGBM. XGBoost predates LightGBM: it was developed by Tianqi Chen and initially released in 2014 [1].
At its core, XGBoost implements GBDTs with a strong focus on training efficiency. Some of the main features of XGBoost are as follows:
- Regularization: XGBoost incorporates both L1 and L2 regularization terms in its training objective to help avoid overfitting
- Sparsity awareness: XGBoost efficiently handles sparse data and missing values, learning a default direction for missing values at each split during training rather than requiring imputation beforehand
- Parallelization: Although boosting builds trees sequentially, the library parallelizes split finding within each tree across CPU cores and supports distributed training, significantly reducing training time
- Early stopping: XGBoost can halt training when the model's performance on a validation set stops improving for a set number of rounds, saving compute and reducing overfitting