Data Analysis with R, Second Edition


Random forests


The final classifier that we will discuss in this chapter is the aptly named Random Forest, which is an example of a meta-technique called ensemble learning. The idea and logic behind random forests are as follows.

Given that (unpruned) decision trees can be nearly unbiased but high-variance classifiers, a method that reduces variance at the cost of a marginal increase in bias could greatly improve the technique's predictive accuracy. One salient approach to reducing the variance of decision trees is to train a number of unpruned decision trees on different random subsets of the training data, sampled with replacement; this is called bootstrap aggregating, or bagging. At classification time, the test observation is run through all of these trees (a forest, perhaps?), and each resulting classification casts a vote for the final classification of the whole forest. The class with the highest number of votes is the winner. It turns out that the consensus among many high-variance...
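
To make the bagging idea concrete, the following is a minimal sketch in R that grows several effectively unpruned trees on bootstrap samples of the training data and lets them vote on each test observation. The rpart package, the built-in iris data, and every parameter choice here are illustrative assumptions for this sketch, not code from the book.

# A toy bagging classifier: many near-unpruned rpart trees vote on each test row
# (rpart and the iris dataset are illustrative choices, not the book's own code)
library(rpart)

set.seed(1)
train_idx <- sample(nrow(iris), 100)
train <- iris[train_idx, ]
test  <- iris[-train_idx, ]

n_trees <- 50
votes <- sapply(seq_len(n_trees), function(i) {
  # bootstrap sample: draw training rows with replacement
  boot <- train[sample(nrow(train), replace = TRUE), ]
  # grow an effectively unpruned tree by disabling the complexity penalty
  tree <- rpart(Species ~ ., data = boot,
                control = rpart.control(cp = 0, minsplit = 2))
  as.character(predict(tree, test, type = "class"))
})

# each column holds one tree's predictions; the ensemble's answer for each
# test observation is the class with the most votes across the trees
forest_pred <- apply(votes, 1, function(v) names(which.max(table(v))))
mean(forest_pred == test$Species)   # accuracy of the ensemble on the held-out rows

In practice, the randomForest package wraps this whole procedure (along with the per-split random feature sampling that turns plain bagging into a random forest) behind a single call such as randomForest(Species ~ ., data = train).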
