In general, to construct a random forest we must first choose the number of trees it will contain. A random forest does not tend to overfit (unless the data is very noisy), so a larger number of decision trees will not decrease the accuracy of the prediction. A sufficient number of trees is also important because each tree is built from a random sample of the data, so more trees means more of the data is used for classification. On the other hand, the more decision trees there are, the more computational power is required, and beyond a certain point adding trees no longer improves classification accuracy by any significant degree.
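The trade-off described above can be observed empirically by training forests of increasing size and comparing their test accuracy. The sketch below assumes scikit-learn is available and uses a synthetic dataset from `make_classification`; the specific parameter values (sample size, tree counts) are illustrative choices, not prescriptions. Typically the accuracy rises sharply from one tree to a modest ensemble and then plateaus, while training time keeps growing with the tree count.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic classification data for illustration
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train forests of increasing size and record test accuracy
scores = {}
for n_trees in (1, 10, 100):
    clf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    clf.fit(X_train, y_train)
    scores[n_trees] = clf.score(X_test, y_test)
    print(f"{n_trees:>4} trees: accuracy {scores[n_trees]:.3f}")
```

Running this typically shows a large gain going from 1 tree to 10, and a much smaller (or no) gain from 10 to 100, illustrating both that extra trees do not hurt accuracy and that the improvement saturates.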
In practice...