Boosting is one of the most powerful ensemble learning techniques, capable of producing highly expressive models. In this section, we will use XGBoost to model our time series data. Because XGBoost exposes many hyperparameters, we expect some fine-tuning to be needed to achieve satisfactory results. By replacing our example's regressor with lr = XGBRegressor(), we can fit XGBoost to our data. This results in an MSE of 19.20 and a Sharpe value of 0.13.
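The following is a minimal sketch of that swap. It assumes the feature matrices X_train/X_test and targets y_train/y_test already hold the lagged time series features prepared in the earlier examples; the hyperparameter values shown are illustrative defaults, not the tuned settings behind the reported results.

from xgboost import XGBRegressor
from sklearn.metrics import mean_squared_error

# Illustrative hyperparameters; assume further tuning is needed in practice.
lr = XGBRegressor(
    n_estimators=100,   # number of boosting rounds (trees)
    max_depth=3,        # maximum depth of each tree
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
)

# X_train, y_train, X_test, y_test are assumed to exist from the earlier setup.
lr.fit(X_train, y_train)          # fit the boosted trees on the training window
predictions = lr.predict(X_test)  # predict on the hold-out window

mse = mean_squared_error(y_test, predictions)
print(f"MSE: {mse:.2f}")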
The following figure depicts the profits and trades generated by the model. Although its Sharpe value is lower than that of the other models, we can see that it continues to generate profit even during periods in which the Bitcoin price drops:
Trades generated by the Boosting model