Summary
In this chapter, we've implemented an XGBoost classification model. We've revisited the business scenario already used in Chapter 6, Classifying Trees with Multiclass Logistic Regression, based on the need to automatically classify New York City trees. After that, we've learned the basics of the XGBoost boosted tree classification algorithm.
In order to build an effective model, we performed data quality checks and then segmented the dataset into three tables according to our needs: one to host the training data, a second one for the evaluation stage, and a last one on which to apply our classification model.
During the training phase of the BigQuery ML model, we've progressively improved the performance of the ML model, using ROC AUC as our key performance indicator (KPI).
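As a quick refresher, the following is a minimal sketch of the kind of BigQuery ML training statement used for a boosted tree (XGBoost) classifier. The dataset name (`nyc_trees_xgboost`), the table name (`training_table`), the feature columns, and the hyperparameter values shown here are illustrative assumptions rather than the exact ones used in the chapter:

```sql
-- Train an XGBoost (boosted tree) multiclass classifier in BigQuery ML.
-- Dataset, table, column names, and hyperparameters are illustrative only.
CREATE OR REPLACE MODEL `nyc_trees_xgboost.classification_model_xgboost`
OPTIONS (
  model_type = 'BOOSTED_TREE_CLASSIFIER',  -- XGBoost-based classifier
  booster_type = 'GBTREE',
  max_iterations = 50,
  input_label_cols = ['spc_latin']         -- assumed label: tree species
) AS
SELECT
  zip_city,
  tree_dbh,
  boroname,
  nta_name,
  health,
  sidewalk,
  spc_latin
FROM
  `nyc_trees_xgboost.training_table`;
```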
After that, we've evaluated the best ML model on a new set of records to guard against overfitting, increasing our confidence in the quality of our XGBoost classification model.
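A corresponding evaluation query might look like the following sketch, which extracts the ROC AUC computed by ML.EVALUATE on the held-out evaluation table; the model and table names are again assumptions:

```sql
-- Evaluate the trained classifier on records not seen during training.
-- Names are illustrative assumptions.
SELECT
  roc_auc
FROM
  ML.EVALUATE(
    MODEL `nyc_trees_xgboost.classification_model_xgboost`,
    (SELECT * FROM `nyc_trees_xgboost.evaluation_table`)
  );
```

Comparing this ROC AUC against the value observed during training is what lets us judge whether the model generalizes to new records rather than overfitting the training set.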
...