Stacking models
– David Austin, Kaggle Winner
In this final section, we will examine one of the most powerful techniques frequently used by Kaggle winners: stacking.
What is stacking?
Stacking combines machine learning models at two different levels: the base level, whose models are trained on the original data and make predictions, and the meta level, which takes the predictions of the base models as input and uses them to generate final predictions.
In other words, the final model in stacking does not take the original data as input, but rather takes the predictions of the base machine learning models as input.
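As a concrete illustration, this two-level arrangement can be sketched with scikit-learn's `StackingClassifier`. The dataset and the choice of base and meta models below are arbitrary examples for demonstration, not a recipe from any particular competition:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Example dataset (assumption: any tabular classification data works here)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

# Base level: models trained on the original features
base_models = [
    ('knn', KNeighborsClassifier()),
    ('rf', RandomForestClassifier(random_state=2)),
]

# Meta level: the final estimator is trained on the base models'
# cross-validated predictions, not on the original features
stack = StackingClassifier(
    estimators=base_models,
    final_estimator=LogisticRegression(max_iter=1000),
)

stack.fit(X_train, y_train)
score = stack.score(X_test, y_test)
print(round(score, 2))
```

Note that `StackingClassifier` handles the key detail internally: it uses cross-validation to generate the base models' predictions that the meta model trains on, which guards against the meta model simply memorizing overfit base predictions.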
Stacked models have found huge success in Kaggle competitions. Most Kaggle competitions have merger deadlines, where individuals and teams can...