Bagging
Bagging stands for Bootstrap AGGregatING, and was invented by Breiman (1994). Bagging is an example of a homogeneous ensemble, because the base learning algorithm remains the same throughout: the classification tree. Here, each bootstrap tree is a base learner. This also means that when we bootstrapped the linear regression model in Chapter 2, Bootstrapping, we were in effect building an ensemble there. A few remarks with regard to combining the results of multiple trees are in order here.
Ensemble methods combine the outputs from multiple models, also known as base learners, into a single result. A benefit of this approach is that if each of these base learners possesses a desired property, then the combined result will have increased stability. If a certain base learner is over-trained in a specific region of the covariate space, the other base learners will offset that undesired prediction. It is this increased stability that is expected from the ensemble, and bagging many...
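To make the bootstrap-and-aggregate idea concrete, here is a minimal sketch in Python, assuming scikit-learn's DecisionTreeClassifier as the base learner; the synthetic dataset, the choice of B = 50 bootstrap trees, and majority voting as the combination rule are illustrative assumptions, not details taken from the text:

```python
# Bagging sketch: fit B classification trees on bootstrap samples,
# then aggregate their predictions by majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

B = 50                     # number of bootstrap trees (an assumed choice)
n = X_train.shape[0]
trees = []
for _ in range(B):
    # Draw a bootstrap sample: n observations sampled with replacement
    idx = rng.integers(0, n, size=n)
    trees.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# Aggregate: each base learner votes, and the majority class wins
votes = np.stack([tree.predict(X_test) for tree in trees])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)

print("Bagged accuracy:     ", (y_pred == y_test).mean())
print("Single-tree accuracy:", (trees[0].predict(X_test) == y_test).mean())
```

Because each tree sees a slightly different bootstrap sample, a tree that is over-trained in one region of the covariate space is typically outvoted there by the others, which is the stabilizing effect the passage describes.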