Analyzing XGBoost parameters
In this section, we will work through a mathematical derivation to analyze the parameters that XGBoost uses to build state-of-the-art machine learning models.
We will maintain the distinction between parameters and hyperparameters as presented in Chapter 2, Decision Trees in Depth. Hyperparameters are chosen before the model is trained, whereas parameters are adjusted while the model is being trained. In other words, the parameters are what the model learns from the data.
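As a minimal sketch of this distinction (using XGBoost's scikit-learn API and the breast cancer dataset from scikit-learn, chosen here purely for illustration), hyperparameters such as max_depth and learning_rate are fixed before fit is called, while the parameters, the split conditions and leaf weights of the boosted trees, are learned from the data during training:

```python
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

# Hyperparameters: chosen before training begins
model = XGBClassifier(max_depth=3, learning_rate=0.1, n_estimators=50)

# Parameters: the split thresholds and leaf weights of each tree,
# learned from the data while fit runs
model.fit(X, y)

# Inspect the first learned tree (part of the model's parameters)
print(model.get_booster().get_dump()[0])
```

Printing the dumped tree reveals the learned split thresholds and leaf values; changing the hyperparameters changes how those parameters are learned, not the other way around.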
The derivation that follows is taken from the XGBoost official documentation, Introduction to Boosted Trees, at https://xgboost.readthedocs.io/en/latest/tutorials/model.html.
Learning objective
The learning objective of a machine learning model measures how well the model fits the data. In the case of XGBoost, the learning objective consists of two parts: the training loss function and the regularization term.
Mathematically, XGBoost's learning objective may be defined as follows:
...
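To make the two parts concrete, here is a toy sketch of the objective as training loss plus regularization. It assumes a squared-error loss and the tree-complexity penalty described in the linked documentation, gamma times the number of leaves plus half lambda times the sum of squared leaf weights; the function names and numbers below are illustrative only:

```python
import numpy as np

def squared_error_loss(y_true, y_pred):
    # Training loss: how well the predictions fit the data
    return np.sum((y_true - y_pred) ** 2)

def tree_regularization(leaf_weights, gamma=1.0, reg_lambda=1.0):
    # Regularization: penalizes complex trees via
    # gamma * (number of leaves) + 0.5 * lambda * (sum of squared leaf weights)
    num_leaves = len(leaf_weights)
    return gamma * num_leaves + 0.5 * reg_lambda * np.sum(leaf_weights ** 2)

# Toy example: 3 samples and a single tree with 2 leaves
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.8, 0.2, 0.6])
leaf_weights = np.array([0.3, -0.1])

objective = squared_error_loss(y_true, y_pred) + tree_regularization(leaf_weights)
print(f"learning objective: {objective:.3f}")
```

The gamma and reg_lambda arguments mirror XGBoost's gamma and lambda (reg_lambda) hyperparameters, which control how strongly tree complexity is penalized relative to the training loss.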