Boosted trees: AdaBoost, XGBoost, LightGBM, and CatBoost
Boosting was first introduced around 1989, and boosted machine learning models have consistently performed well, especially on tabular data. Some of the more common boosting algorithms you will see are AdaBoost, gradient boosting, XGBoost, LightGBM, and CatBoost.
XGBoost, initially released in 2014, has been used to win several machine learning competitions (for example, on Kaggle). LightGBM was developed shortly afterward by Microsoft and released in 2016, and CatBoost followed in 2017. (For a more detailed history of boosting, see this paper: https://cseweb.ucsd.edu/~yfreund/papers/IntroToBoosting.pdf.) These libraries differ slightly in their underlying algorithms and implementations, so when modeling a dataset, it doesn't hurt to try as many of them as you can. An easy way to do this is with the PyCaret package we covered in the previous chapter on model optimization and AutoML.
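For instance, here is a minimal sketch of comparing these boosting models with PyCaret's classification API, using PyCaret's built-in 'juice' sample dataset. The model IDs shown are PyCaret's identifiers for each algorithm (they may vary between PyCaret versions), and the xgboost, lightgbm, and catboost packages must be installed separately:

```python
# A minimal sketch: compare boosting algorithms with PyCaret.
from pycaret.classification import setup, compare_models
from pycaret.datasets import get_data

# Load one of PyCaret's built-in sample datasets.
df = get_data('juice')

# Initialize the experiment (the target column in 'juice' is 'Purchase').
setup(data=df, target='Purchase', session_id=42)

# Compare only the boosting models discussed here. PyCaret model IDs:
# 'ada' = AdaBoost, 'gbc' = sklearn's gradient boosting,
# plus 'xgboost', 'lightgbm', and 'catboost'.
best_model = compare_models(
    include=['ada', 'gbc', 'xgboost', 'lightgbm', 'catboost']
)
```

compare_models trains each candidate with cross-validation, prints a leaderboard of metrics, and returns the best-scoring model, so a single call gives you a quick head-to-head comparison of all the boosting libraries.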
First, let's learn how boosting works.