AdaBoost is one of the most popular boosting algorithms. As with bagging, the main idea is to create a number of weak learners and then combine their predictions. The main difference from bagging is that, instead of creating a number of independent bootstrapped train sets, the algorithm trains each weak learner sequentially: it assigns a weight to every instance, samples the next train set according to those weights, trains the next learner, and repeats the whole process. The base learners are usually decision trees consisting of a single split; these one-level trees are called decision stumps.
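To make the sequential process concrete, the following is a minimal sketch, not a definitive implementation: it trains decision stumps on weighted resamples of the data and combines them with a weighted vote. It assumes NumPy and scikit-learn are available and that the labels are encoded as -1/+1; the function names are illustrative only.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(x, y, ensemble_size=50, seed=0):
    """Sequentially train decision stumps, resampling instances by weight."""
    rng = np.random.default_rng(seed)
    n = len(y)
    weights = np.full(n, 1.0 / n)          # start with uniform instance weights
    learners, alphas = [], []
    for _ in range(ensemble_size):
        # Sample the next train set according to the current instance weights.
        idx = rng.choice(n, size=n, replace=True, p=weights)
        stump = DecisionTreeClassifier(max_depth=1).fit(x[idx], y[idx])
        pred = stump.predict(x)
        # Weighted error of this stump on the full train set.
        err = np.clip(np.sum(weights[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # the stump's weight in the final vote
        # Increase the weights of misclassified instances, decrease the rest.
        weights *= np.exp(-alpha * np.where(pred == y, 1, -1))
        weights /= weights.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(learners, alphas, x):
    """Combine the stumps' predictions with a weighted majority vote."""
    votes = sum(a * l.predict(x) for l, a in zip(learners, alphas))
    return np.sign(votes)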
Weighted sampling
Weighted sampling is the sampling process in which each instance is assigned a weight that determines its probability of being selected; instances with larger weights are more likely to appear in the sampled train set.
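As a quick illustration (the weights below are made up for the example), NumPy's random generator can draw instances with probability proportional to their weights:

import numpy as np

rng = np.random.default_rng(0)
instances = np.arange(5)                          # five hypothetical instances
weights = np.array([0.4, 0.3, 0.15, 0.1, 0.05])   # instance weights, summing to 1

# Each draw picks an instance with probability equal to its weight,
# so instance 0 is expected in roughly 40% of the draws.
sample = rng.choice(instances, size=10, replace=True, p=weights)
print(sample)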