In this chapter, we explored the concept of ensemble learning through random forests, AdaBoost, and gradient boosted trees. However, ensembling is not limited to tree-based models.
If we built a logistic regression classifier, a random forest classifier, and a k-nearest neighbors classifier, and we wanted to combine all three and extract the final prediction through majority voting, we could do so by using a voting classifier.
This concept can be better understood with the aid of the following diagram:
Figure: Ensemble learning with a voting classifier to predict fraudulent transactions
When examining the preceding diagram, note the following:
- The random forest classifier predicted that a particular transaction was fraudulent, while the other two classifiers predicted that the transaction was not fraudulent.
- The voting classifier sees that two out of the three classifiers predicted that the transaction was not fraudulent, so the majority vote determines the final prediction: not fraudulent.
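To make this concrete, here is a minimal sketch of how such a voting ensemble could be built with scikit-learn's VotingClassifier. The synthetic dataset and the hyperparameter values are illustrative assumptions, not the chapter's fraud dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for a fraud dataset (illustrative only)
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The three base classifiers discussed above
log_reg = LogisticRegression(max_iter=1000)
forest = RandomForestClassifier(n_estimators=100, random_state=42)
knn = KNeighborsClassifier(n_neighbors=5)

# Hard voting: the final label is the majority vote of the three models
voting_clf = VotingClassifier(
    estimators=[('lr', log_reg), ('rf', forest), ('knn', knn)],
    voting='hard'
)
voting_clf.fit(X_train, y_train)
print(voting_clf.score(X_test, y_test))
```

With `voting='hard'`, each classifier casts one vote per transaction, exactly as in the diagram; passing `voting='soft'` would instead average the predicted class probabilities, which can help when the base classifiers output well-calibrated probabilities.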