Comparing dart
The dart base learner is similar to gbtree in the sense that both are gradient boosted trees. The primary difference is that dart removes trees (a technique called dropout) during each round of boosting. In this section, we will apply the dart base learner and compare it to other base learners in regression and classification problems.
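XGBoost controls dart's dropout through booster-specific parameters such as rate_drop, skip_drop, and one_drop, which the scikit-learn wrappers pass straight through to the booster. The following is a rough sketch only; the values are arbitrary illustrations, not tuned recommendations:

from xgboost import XGBRegressor

# Illustrative dropout settings for the dart booster -- values are arbitrary
dart_model = XGBRegressor(
    booster='dart',
    objective='reg:squarederror',
    rate_drop=0.1,   # fraction of previously built trees dropped each boosting round
    skip_drop=0.5,   # probability that a round skips dropout entirely
    one_drop=1       # drop at least one tree whenever dropout is applied
)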
DART with XGBRegressor
Let's see how dart performs on the Diabetes dataset:
First, redefine X and y using load_diabetes as before:

X, y = load_diabetes(return_X_y=True)
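The scores in this section come from the regression_model function defined earlier in the chapter. If you are working through this section on its own, here is a minimal sketch that matches how the function is called below, assuming it returns the mean 5-fold cross-validated RMSE (the exact scoring setup of the earlier function is an assumption):

import numpy as np
from sklearn.model_selection import cross_val_score

def regression_model(model):
    # 5-fold cross-validation scored by negative MSE on the X and y loaded above
    scores = cross_val_score(model, X, y, scoring='neg_mean_squared_error', cv=5)
    # convert to RMSE and return the mean across folds
    rmse = np.sqrt(-scores)
    return rmse.mean()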
To use dart as the XGBoost base learner, set the XGBRegressor parameter booster='dart' inside the regression_model function:

regression_model(XGBRegressor(booster='dart', objective='reg:squarederror'))
The score is as follows:
65.96444746130739
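To check the comparison yourself, you can run both boosters through the same helper in one go; this loop is a sketch rather than part of the original walkthrough, and exact numbers may vary slightly with your XGBoost version:

for booster in ('gbtree', 'dart'):
    score = regression_model(XGBRegressor(booster=booster, objective='reg:squarederror'))
    print(booster, score)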
The dart base learner gives the same result as the gbtree base learner down to two decimal places. The similarity of results is on account of the small dataset and the success of the gbtree...