It is well established that combined forecasts tend to outperform single forecasts. This phenomenon is called the Wisdom of the Crowd. Random forests exploit the Wisdom of the Crowd by fitting many trees and combining their predictions. Because of this combination step, algorithms such as random forests are also called ensemble learning methods. Random forests are not the only ensemble learning algorithms; bagging, boosting, and committees also combine several models.
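As a minimal sketch of the idea (assuming Python with scikit-learn, which is not necessarily the toolkit used later in this section), a random forest grows many trees on bootstrap samples and averages their forecasts, typically beating a single tree:

```python
# Sketch: a single tree vs. an ensemble of trees (assumes scikit-learn and a toy dataset).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One tree: a single, high-variance forecast.
tree = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)

# A forest: many trees grown on bootstrap samples, with predictions averaged.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

print("single tree MSE:", mean_squared_error(y_test, tree.predict(X_test)))
print("random forest MSE:", mean_squared_error(y_test, forest.predict(X_test)))
```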
In this section, we look not only at random forests but also at the other kinds of models and packages that could plausibly compete with them. This time we benchmark not only accuracy but also elapsed time. Note that elapsed time is a very crude measure and may vary widely between my machine and yours.
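As a rough sketch of how elapsed time can be recorded (the model and dataset here are placeholders, not the ones actually benchmarked), one can simply wrap the fit in a wall-clock timer:

```python
# Timing sketch (assumes Python's standard library plus scikit-learn).
import time

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=5000, n_features=20, noise=10.0, random_state=0)

start = time.perf_counter()
RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
elapsed = time.perf_counter() - start

# Wall-clock seconds; this number depends heavily on the hardware used.
print(f"fit took {elapsed:.2f} s")
```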