Random forests are extensions of decision trees and are a kind of ensemble method.
Ensemble methods can achieve high accuracy by building several classifiers and running each one independently. When the classifiers make their decisions, you can combine them by taking either the most common decision or the average decision. Taking the most common decision is called voting.
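The two combination rules can be sketched in a few lines of plain Python. This is an illustrative sketch, not a library API: `majority_vote` and `average_score` are hypothetical helper names, and the label and score values are made up.

```python
from collections import Counter

def majority_vote(predictions):
    """Hard voting: return the most common label among the classifiers."""
    return Counter(predictions).most_common(1)[0][0]

def average_score(scores):
    """Averaging: combine the classifiers' numeric scores into one."""
    return sum(scores) / len(scores)

# Three hypothetical classifiers judge the same sample:
print(majority_vote(["spam", "ham", "spam"]))  # voting -> "spam"
print(average_score([0.9, 0.4, 0.8]))          # average, close to 0.7
```

Voting suits discrete class labels; averaging suits probability scores or regression outputs.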
Here's a diagram depicting the ensemble method:
You can think of each classifier as being specialized for a unique perspective on the data. Each classifier may be a different type: for example, you can combine a decision tree, a logistic regression model, and a neural network. Alternatively, the classifiers may all be the same type but trained on different parts or subsets of the training data.
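As one way to build such a mixed ensemble, scikit-learn's `VotingClassifier` can wrap the three classifier types just mentioned. This is a minimal sketch on the Iris dataset; the estimator names and hyperparameters are illustrative choices, not prescribed by the text.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Three different classifier types combined by majority vote.
ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("mlp", MLPClassifier(max_iter=2000, random_state=0)),
    ],
    voting="hard",  # hard = take the most common predicted label
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```

Setting `voting="soft"` instead would average the classifiers' predicted probabilities rather than counting votes.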
A random forest is a collection, or ensemble, of decision trees. Each tree is trained on a random sample of the training data, and at each split it considers only a random subset of the attributes, as shown in...