XGBoost Library
The library we used to perform the preceding classification is XGBoost. It offers extensive customization through its many parameters. In the following sections, we will examine the parameters and functions of the XGBoost library in detail.
Note
For more information about XGBoost, refer to the website: https://xgboost.readthedocs.io
Training
The parameters that affect the training of any XGBoost model are listed below.
- booster: Although we mentioned in the introduction that XGBoost's base learner is a regression tree, the library also lets us use a linear model as the weak learner. A third option, the DART booster, is a variant of tree boosting that randomly drops trees during training to prevent overfitting. Pass "gbtree" (the default) for tree boosting, "gblinear" for linear boosting, or "dart" for tree boosting with dropout.
Note
You may learn more about DART from this paper: http://www.jmlr...