Boosting
Boosting is a family of ensemble methods that are primarily used to reduce the bias of an estimator. It can be used for both classification and regression tasks. Like bagging, boosting creates ensembles of homogeneous estimators. We will focus our discussion of boosting on one of the most popular boosting algorithms, AdaBoost.
AdaBoost is an iterative algorithm that was formulated by Yoav Freund and Robert Schapire in 1995. Its name is a portmanteau of adaptive boosting. On the first iteration, AdaBoost assigns equal weights to all of the training instances and then trains a weak learner. A weak learner (or weak classifier, weak predictor, and so on) is defined only as an estimator that performs slightly better than random chance, such as a decision tree with one or a small number of nodes. Weak learners are often, but not necessarily, simple models. A strong learner, in contrast, is defined as an estimator that is arbitrarily better than a weak learner. Most boosting algorithms...
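The following is a minimal sketch of this idea using scikit-learn's AdaBoostClassifier, which boosts decision stumps by default; the synthetic dataset and its parameters are illustrative assumptions, not taken from the text:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # A synthetic binary classification task; the parameters are illustrative.
    X, y = make_classification(n_samples=1000, n_features=50, n_informative=30,
                               n_clusters_per_class=3, random_state=11)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=11)

    # A decision stump (a tree with a single split) is a weak learner:
    # it should perform only slightly better than random chance here.
    stump = DecisionTreeClassifier(max_depth=1, random_state=11)
    stump.fit(X_train, y_train)
    print('Decision stump accuracy: %.3f' % stump.score(X_test, y_test))

    # AdaBoost boosts decision stumps by default. On each iteration it
    # increases the weights of misclassified training instances so that
    # the next stump focuses on the examples its predecessors missed.
    ensemble = AdaBoostClassifier(n_estimators=50, random_state=11)
    ensemble.fit(X_train, y_train)
    print('AdaBoost accuracy: %.3f' % ensemble.score(X_test, y_test))

On a dataset like this, the boosted ensemble's accuracy should substantially exceed that of the single stump, illustrating how boosting reduces the bias of a weak learner.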