Ensembling by voting
Ensembling by voting is an effective approach for classification problems. We now have a set of classifiers, and we need to use them to predict the class of an unknown case. The predictions of the classifiers can be combined in multiple ways; the two options that we will consider are majority voting and weighted voting, and a small sketch contrasting the two is given next.
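To make the distinction concrete, the following toy sketch, with purely hypothetical labels and weights, shows how the two rules can disagree for a single observation: the majority rule sides with the two weaker classifiers, while the weighted rule sides with the single, more trusted one.

> # Hypothetical predictions from three classifiers for one observation
> preds <- c("Good","Bad","Bad")
> weights <- c(0.6,0.25,0.15)
> # Majority voting: the most frequent prediction wins
> names(which.max(table(preds)))
[1] "Bad"
> # Weighted voting: votes are summed by classifier weight before choosing
> names(which.max(tapply(weights,preds,sum)))
[1] "Good"

The weights here are illustrative; in practice they would be derived from, say, the validation accuracy of each classifier.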
Majority voting
Ideas related to voting will be illustrated through an ensemble of homogeneous base learners, namely decision trees, as used in the development of bagging and random forests. We will create 500 base learners using the randomForest
function, repeating the program from the first block of Chapter 4, Random Forests. Ensembling has already been performed in that chapter, and we will elaborate on those steps here. The code block for setting up the random forest is given here:
> load("../Data/GC2.RData") > set.seed(12345) > Train_Test <- sample(c("Train","Test"),nrow...