AdaBoost, proposed by Freund and Schapire in 1996, is one of the earliest boosting algorithms and was originally designed for binary classification. Many other boosting-based algorithms have since been built on top of it.
Another variation of adaptive boosting is AdaBoost-abstain, which allows each base classifier to abstain from voting when the feature it depends on is missing.
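To make the abstention idea concrete, here is a minimal sketch of such a base classifier. The class name `AbstainingStump` and its interface are illustrative assumptions, not taken from any published AdaBoost-abstain implementation; it votes on a single feature and returns 0 (abstains) when that feature is missing:

```python
import numpy as np

class AbstainingStump:
    """Decision stump that votes -1/+1 on one feature, or 0 to abstain."""

    def __init__(self, feature_idx, threshold):
        self.feature_idx = feature_idx
        self.threshold = threshold

    def predict(self, X):
        values = X[:, self.feature_idx]
        votes = np.where(values > self.threshold, 1, -1)
        # Abstain (vote 0) wherever the dependent feature is missing.
        return np.where(np.isnan(values), 0, votes)
```

In the ensemble's weighted vote, an abstaining classifier simply contributes nothing for that sample.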
AdaBoost works by combining a set of weak learners into a single strong learner. The process of an AdaBoost classifier is as follows:
- Initially, a short decision tree classifier is fit to the data, with every sample weighted equally. The tree may have just a single split, in which case it is known as a decision stump. The classifier's weighted error is then evaluated. This is the first iteration.
- In the second iteration, whatever data was correctly classified has its weight decreased, while misclassified samples have their weights increased. The next weak learner is then fit to the reweighted data, so it concentrates on the examples the previous one got wrong; the final prediction is a weighted vote of all the weak learners, as shown in the sketch below.
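The following is a minimal from-scratch sketch of this reweighting loop, assuming binary labels encoded as -1/+1. The function names `adaboost_fit` and `adaboost_predict` are illustrative, and scikit-learn's `DecisionTreeClassifier` with `max_depth=1` stands in for the decision stump:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Fit a sequence of decision stumps with AdaBoost reweighting (y in {-1, +1})."""
    n_samples = len(y)
    w = np.full(n_samples, 1.0 / n_samples)  # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)  # single split = decision stump
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()                         # weighted error rate
        alpha = 0.5 * np.log((1 - err) / (err + 1e-10))  # this stump's vote weight
        w *= np.exp(-alpha * y * pred)  # raise weights of misclassified samples
        w /= w.sum()                    # renormalize so weights stay a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Combine the weak learners' votes into the strong learner's prediction."""
    scores = sum(alpha * stump.predict(X) for stump, alpha in zip(stumps, alphas))
    return np.sign(scores)
```

In practice, scikit-learn's `AdaBoostClassifier` implements this same scheme (via the SAMME algorithm, which reduces to classic AdaBoost in the binary case) and is preferable to a hand-rolled loop.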