Random Forests: Ensembles of Decision Trees
As we saw in the previous exercise, decision trees are prone to overfitting. This is one of the principal criticisms of their use, despite their high interpretability. However, we were able to limit this overfitting to some extent by capping the maximum depth to which the tree could be grown.
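As a quick refresher, here is a minimal sketch of depth capping, assuming scikit-learn; the two-moons dataset below is a stand-in used purely for illustration, not the data from the previous exercise:

# Compare an unconstrained tree with a depth-limited one.  This sketch
# assumes scikit-learn; make_moons is an illustrative synthetic dataset.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained tree can keep splitting until it memorizes the training set...
deep = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# ...whereas capping max_depth trades training accuracy for generalization.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

print("unconstrained: train %.3f, test %.3f"
      % (deep.score(X_train, y_train), deep.score(X_test, y_test)))
print("max_depth=3:   train %.3f, test %.3f"
      % (shallow.score(X_train, y_train), shallow.score(X_test, y_test)))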
It turns out that there are powerful and widely used predictive models that employ decision trees as the basis for more complex procedures. In particular, we will focus here on random forests of decision trees. Random forests are examples of what are called ensemble models, because they are formed by combining other models. By combining the predictions of many models, it is possible to compensate for the deficiencies of any single one of them.
Once you understand decision trees, the concept behind random forests is actually quite simple. That is because random forests are just ensembles of many decision trees; all the models in this kind of ensemble are decision trees, each trained with some injected randomness so that the individual trees differ from one another, and the forest predicts by aggregating their individual outputs.
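To make this concrete, here is a minimal sketch comparing a single unconstrained tree to a random forest, again assuming scikit-learn and reusing the synthetic moons data from the snippet above; the hyperparameter values are illustrative, not tuned:

# A single tree versus a random forest on the same synthetic data.
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Each of the n_estimators trees is fit on a bootstrap sample of the training
# data, with extra randomness in its splits; the forest aggregates their votes.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("single tree test accuracy:   %.3f" % tree.score(X_test, y_test))
print("random forest test accuracy: %.3f" % forest.score(X_test, y_test))

Because the trees make partly independent errors, their aggregated vote typically generalizes better than any individual tree, which is the ensemble effect described above.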