Stacking
Stacking, also known as stacked generalization or meta-ensembling, is a model ensembling technique in which the predictions of multiple base models are combined and used as features to train a new model (the meta-model). The stacked model often outperforms each of the individual models, both because combining predictions has a smoothing effect and because the meta-model can learn to "favor" whichever base model performs best in a given scenario. For this reason, stacking is usually most effective when the base models are significantly different from one another.
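As a concrete illustration, here is a minimal sketch using scikit-learn (assumed to be available); the dataset, model choices, and hyperparameters are illustrative only. Deliberately dissimilar base models are combined, and a logistic regression meta-model is trained on their out-of-fold predictions.

```python
# Minimal stacking sketch (illustrative; models and data are placeholders).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic data standing in for a real problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Base models chosen to be structurally different from each other.
base_models = [
    ("random_forest", RandomForestClassifier(n_estimators=100, random_state=42)),
    ("svm", SVC(probability=True, random_state=42)),
]

# cv=5: each base model's out-of-fold predictions become the meta-model's features,
# and the meta-model (logistic regression) learns how to weight them.
stack = StackingClassifier(
    estimators=base_models,
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print("stacked accuracy:", stack.score(X_test, y_test))
```

In practice you would compare the stacked model's score against each base model fit on its own; the gain from stacking tends to be largest when the base models make uncorrelated errors.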
Stacking is widely used in real-world applications. A popular example is the Netflix Prize, a competition to build the best recommendation engine for the well-known streaming platform, in which the two top-performing teams built solutions based on stacked models. One of the leading solutions used feature-weighted linear stacking, in which meta-features determine instance-specific weights for linearly combining the base models' predictions.