Until now, we have looked at single learning algorithms of growing complexity. Ensembles offer an effective alternative: they achieve better predictive accuracy by combining or chaining the results of multiple models trained on different data samples and with different algorithm settings. Ensemble strategies fall into two branches, depending on how the predictions are combined:
- Averaging algorithms: These make predictions by averaging the results of several estimators trained in parallel. Depending on how each estimator samples the data, this branch divides further into four families: pasting, bagging, random subspaces, and random patches.
- Boosting algorithms: These make predictions by taking a weighted average of estimators built sequentially, with each new estimator trained to correct the errors of its predecessors.
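The two branches above can be sketched with scikit-learn's ensemble module. This is a minimal illustration, not one of the worked examples that follow: it trains a bagging ensemble (averaging branch) and an AdaBoost ensemble (boosting branch) on a synthetic dataset; the dataset sizes and estimator counts are arbitrary choices for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification problem, purely for illustration
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Averaging branch: bagging trains independent trees on bootstrap
# samples of the data and averages their predictions
bagging = BaggingClassifier(DecisionTreeClassifier(),
                            n_estimators=50, random_state=0)
bagging.fit(X_train, y_train)

# Boosting branch: AdaBoost builds estimators sequentially, reweighting
# the training points that earlier estimators misclassified
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)
boosting.fit(X_train, y_train)

print("Bagging accuracy: %.3f" % bagging.score(X_test, y_test))
print("Boosting accuracy: %.3f" % boosting.score(X_test, y_test))
```

Both classes expose the same `fit`/`predict`/`score` interface as the single estimators seen so far, so ensembles can be dropped into an existing pipeline with no other changes.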
Before delving into examples of both classification and regression, we will provide you with the necessary steps to reload...