Chapter 9: XGBoost Kaggle Masters
In this chapter, you will learn valuable tips and tricks from Kaggle Masters who have used XGBoost to win Kaggle competitions. Although we will not enter a Kaggle competition here, the skills you gain apply to building stronger machine learning models in general. Specifically, you will learn why an extra hold-out set is critical, how to engineer new columns of data with mean encoding, how to implement VotingClassifier and VotingRegressor to build non-correlated machine learning ensembles, and the advantages of stacking a final model.
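As a preview of mean encoding, here is a minimal sketch using pandas on a hypothetical toy DataFrame (the column names `cat` and `target` are invented for illustration): each category is replaced by the mean of the target over the rows that share that category.

```python
import pandas as pd

# Hypothetical toy data: 'cat' is a categorical feature, 'target' is the label.
df = pd.DataFrame({
    'cat': ['a', 'b', 'a', 'b', 'a'],
    'target': [1, 0, 1, 1, 0],
})

# Mean encoding: map each category to the mean of the target for that category.
# Category 'a' has targets (1, 1, 0) -> 2/3; category 'b' has (0, 1) -> 0.5.
df['cat_mean_enc'] = df.groupby('cat')['target'].transform('mean')
```

Note that in practice mean encoding must be computed from training data only (for example, within cross-validation folds); computing it over the full dataset leaks the target into the features.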
In this chapter, we will cover the following main topics:
Exploring Kaggle competitions
Engineering new columns of data
Building non-correlated ensembles
Stacking final models
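To preview the ensemble idea, the following is a minimal sketch of scikit-learn's VotingClassifier combining two dissimilar models on synthetic data. The dataset and estimator choices here are illustrative assumptions; in the chapter itself an XGBoost model would typically join the ensemble.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical synthetic data standing in for a real competition dataset.
X, y = make_classification(n_samples=200, n_features=8, random_state=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# Combine dissimilar (ideally non-correlated) models by majority vote;
# an XGBClassifier could be added to the estimators list in the same way.
ensemble = VotingClassifier(estimators=[
    ('lr', LogisticRegression(max_iter=1000)),
    ('rf', RandomForestClassifier(random_state=2)),
])
ensemble.fit(X_train, y_train)
score = ensemble.score(X_test, y_test)
```

The gain from voting comes from the base models making different mistakes, which is why the chapter emphasizes building ensembles from non-correlated models rather than from several copies of similar ones.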