Decision trees are prone to overfitting the training data and suffer from high variance, so a single tree often generalizes poorly to new, unseen data. Using an ensemble of decision trees helps alleviate this shortcoming: many weak learners are combined to form a single strong learner.
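As a minimal illustration of this variance-reduction idea, the sketch below compares a single decision tree against a bagged ensemble of trees. The use of scikit-learn, the synthetic dataset, and all parameter values are assumptions for demonstration, not part of the original text.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic dataset; sizes and seed are illustrative choices.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# A single, fully grown tree tends to overfit the training folds.
single_tree = DecisionTreeClassifier(random_state=42)

# Bagging averages many trees fit on bootstrap samples of the data,
# which reduces the variance of the combined prediction.
bagged_trees = BaggingClassifier(
    DecisionTreeClassifier(), n_estimators=100, random_state=42
)

for name, model in [("single tree", single_tree), ("bagged trees", bagged_trees)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

On a typical run, the bagged ensemble scores noticeably higher and more consistently across folds than the single tree.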
Among the many ways to combine decision trees into ensembles, two methods stand out for their predictive performance (both are sketched in the example after this list):
- Gradient boosting (also known as gradient tree boosting)
- Random forests (also known as random decision forests)
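The sketch below shows both methods side by side via their scikit-learn implementations. Again, the dataset and every parameter value are illustrative assumptions rather than recommendations from the original text.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Gradient boosting: trees are added sequentially, each one fit to the
# residual errors of the ensemble built so far.
gbt = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, random_state=42
)

# Random forest: trees are grown independently on bootstrap samples,
# with a random subset of features considered at each split.
rf = RandomForestClassifier(n_estimators=100, random_state=42)

for name, model in [("gradient boosting", gbt), ("random forest", rf)]:
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

The contrast in how the trees are combined is the key design difference: boosting builds trees sequentially to correct earlier mistakes, while a random forest builds them independently and averages their votes.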