Bootstrap aggregation, also known as bagging, is a powerful ensemble method proposed by Leo Breiman in 1994 to reduce overfitting. The idea behind bagging is to train several base learners, each on a bootstrap sample (a random sample drawn with replacement) of the training data, and combine their predictions into a single, more accurate output.
Breiman showed that bagging is most effective with unstable learning algorithms, where small changes to the training data can lead to large variations in the predictions; he cited decision trees and neural networks as examples of such algorithms. Bootstrap aggregation can also be effective on small datasets, where resampling makes fuller use of the limited training data.
The general bagging procedure helps to reduce variance for algorithms that have high variance, and it applies to both classification and regression problems. The following diagram shows how the bootstrap aggregation flow works.
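The flow can also be sketched in code. The example below is a minimal from-scratch illustration, not a production implementation: it bags a deliberately weak, unstable base learner (a one-feature decision stump) on synthetic 1-D data and combines the stumps' predictions by majority vote. All names, the dataset, and the parameter choices (25 rounds, 10% label noise) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y):
    """Fit a 1-D decision stump: predict class 1 when x >= threshold."""
    best_thr, best_err = 0.0, np.inf
    for thr in np.unique(X):
        err = np.mean((X >= thr).astype(int) != y)
        if err < best_err:
            best_thr, best_err = thr, err
    return best_thr

def bagged_predict(thresholds, X):
    """Majority vote over the stumps trained on bootstrap samples."""
    votes = np.stack([(X >= t).astype(int) for t in thresholds])
    return (votes.mean(axis=0) >= 0.5).astype(int)

# Toy 1-D data: class 1 when x > 0, with ~10% of labels flipped as noise.
X = rng.normal(size=200)
y = ((X > 0).astype(int) + (rng.random(200) < 0.1)) % 2

# Bagging loop: draw a bootstrap sample (with replacement) of the same
# size as the training set, then fit one base learner on each sample.
thresholds = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))
    thresholds.append(fit_stump(X[idx], y[idx]))

# Aggregated accuracy on the full (noisy) dataset.
acc = np.mean(bagged_predict(thresholds, X) == y)
print(round(acc, 2))
```

For regression, the aggregation step would average the base learners' numeric predictions instead of taking a majority vote; in practice one would reach for a library ensemble class rather than hand-rolling the loop.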