Finding XGBoost random forests
There are two strategies for implementing random forests within XGBoost. The first is to use random forests as the base learner; the second is to use XGBoost's original random forests, XGBRFRegressor and XGBRFClassifier. We start with our original theme, random forests as alternative base learners.
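As a quick preview of the second strategy, the following sketch shows that XGBRFClassifier is a drop-in scikit-learn estimator; the synthetic dataset and hyperparameter values here are illustrative assumptions, not taken from the text:

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBRFClassifier

# Illustrative synthetic data (assumption, not from the text)
X, y = make_classification(n_samples=1000, random_state=2)

# XGBRFClassifier follows the scikit-learn API, so it works with
# cross_val_score like any other estimator
model = XGBRFClassifier(n_estimators=100, random_state=2)
print(cross_val_score(model, X, y, cv=5).mean())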
Random forests as base learners
There is no option to set the booster hyperparameter to a random forest. Instead, the num_parallel_tree hyperparameter may be increased from its default value of 1 to transform gbtree (or dart) into a boosted random forest. The idea here is that each boosting round no longer consists of one tree, but of a number of parallel trees, which together make up a forest.
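As a minimal sketch of this strategy, the following code builds a boosted random forest with the scikit-learn wrapper; the dataset and the particular values of n_estimators and num_parallel_tree are assumptions chosen for illustration:

from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

# Illustrative synthetic data (assumption, not from the text)
X, y = make_regression(n_samples=1000, random_state=2)

# With num_parallel_tree=10, each of the 50 boosting rounds builds
# a forest of 10 parallel trees, for 500 trees in total
model = XGBRegressor(booster='gbtree', n_estimators=50,
                     num_parallel_tree=10, random_state=2)
scores = cross_val_score(model, X, y, cv=5,
                         scoring='neg_mean_squared_error')
print(scores.mean())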
The following is a quick summary of the XGBoost hyperparameter num_parallel_tree.
num_parallel_tree
num_parallel_tree gives the number of trees, potentially more than 1, that are built during each boosting round:
Default: 1
Range: [1, inf)
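To use this hyperparameter for a standalone (non-boosted) random forest, one common recipe from the XGBoost documentation is to limit training to a single boosting round and turn on row and column subsampling. The following sketch applies that recipe; the dataset and the specific subsampling values are assumptions for illustration:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Illustrative synthetic data (assumption, not from the text)
X, y = make_classification(n_samples=1000, random_state=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# One boosting round (n_estimators=1) with 100 parallel trees plus
# row/column subsampling behaves like a classic random forest
model = XGBClassifier(n_estimators=1, num_parallel_tree=100,
                      subsample=0.8, colsample_bynode=0.8,
                      random_state=2)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))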