Performing cross-validation with the bagging method
To assess the predictive power of a classifier, you can run cross-validation to test the robustness of the classification model. In this recipe, we will introduce how to use bagging.cv to perform cross-validation with the bagging method.
Getting ready
In this recipe, we continue to use the telecom churn dataset as the input data source to perform a k-fold cross-validation with the bagging method.
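If you are starting from this recipe, the following sketch shows one way to prepare the training set. It assumes the churn data shipped with the C50 package and a 70/30 split, as in the earlier recipes of this chapter; your exact split and seed may differ:

```r
# Setup sketch (assumptions: adabag and C50 are installed; the split
# below mirrors the earlier recipes but is not taken from this one).
library(adabag)   # provides bagging.cv
library(C50)      # the telecom churn dataset ships with C50
data(churn)
set.seed(2)
ind <- sample(2, nrow(churnTrain), replace = TRUE, prob = c(0.7, 0.3))
trainset <- churnTrain[ind == 1, ]
testset  <- churnTrain[ind == 2, ]
```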
How to do it...
Perform the following steps to retrieve the minimum estimation errors by performing cross-validation with the bagging method:
- First, we use bagging.cv to perform a 10-fold cross-validation on the training dataset with 10 iterations:
> churn.baggingcv = bagging.cv(churn ~ ., v=10, data=trainset, mfinal=10)
- You can then obtain the confusion matrix from the cross-validation results:
> churn.baggingcv$confusion
Output:
                Observed Class
Predicted Class  yes   no
...
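Besides the confusion matrix, the object returned by bagging.cv also carries an estimated error rate, which is useful when comparing models; a minimal sketch (the field name follows adabag's documented return value):

```r
# Retrieve the average error estimated by the 10-fold cross-validation.
# bagging.cv returns a list with components class, confusion, and error.
churn.baggingcv$error
```

A lower error here indicates a more robust classifier; you can rerun the recipe with different mfinal values and keep the model with the minimum estimated error.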