In this example, we're going to see how to do boosting and bagging, which are two methods of improving a model. The idea behind boosting is that by building successive models, each trained to predict the misclassifications of the earlier ones, you're performing a form of error modeling. Bagging, on the other hand, works by sampling with replacement: new training datasets are generated that are the same size as the original dataset. For the example in this section, we'll be using a bootstrap sample.
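The contrast between the two methods can be sketched in code. This is a minimal illustration, assuming scikit-learn is available; the synthetic dataset, estimator counts, and random seeds are all illustrative choices, not values from the text.

```python
# Sketch of boosting vs. bagging with scikit-learn (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split

# A small synthetic classification problem.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Boosting: each successive model is fit with more weight on the
# examples the earlier models misclassified.
boost = AdaBoostClassifier(n_estimators=50, random_state=0)
boost.fit(X_train, y_train)

# Bagging: each model is trained on a bootstrap sample -- drawn with
# replacement, the same size as the original training set.
bag = BaggingClassifier(n_estimators=50, random_state=0)
bag.fit(X_train, y_train)

print("boosting accuracy:", boost.score(X_test, y_test))
print("bagging accuracy:", bag.score(X_test, y_test))
```

Both ensembles expose the same `fit`/`score` interface as a single estimator, so they can be dropped into an existing pipeline unchanged.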