In the previous chapter, we spoke of solutions to common problems that fall under the umbrella term of messy data. In this chapter, we are going to solve some of the problems related to working with large datasets.
Problems with large datasets can occur in R for a few reasons. For one, R (and most other languages, for that matter) was developed at a time when commodity computers had only one processor/core. This means that vanilla R code can't exploit multiple processors/cores, which can offer substantial speed-ups. Another salient reason why R might run into trouble analyzing large datasets is that R requires the data objects it works with to be stored entirely in RAM. If your dataset exceeds the capacity of your RAM, your analyses will slow to a crawl or fail outright.
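To get a feel for these two constraints on your own machine, here is a minimal sketch using only packages that ship with base R (parallel and utils); the 10-million-row data frame is just made-up example data, not anything from our analyses.

library(parallel)

# How many processors/cores does this machine have?
# Vanilla R will, by default, use only one of them.
detectCores()

# How much RAM does an in-memory object occupy?
# A data frame with 10 million rows and 3 numeric columns...
big.df <- data.frame(a = rnorm(1e7), b = rnorm(1e7), c = rnorm(1e7))

# ...takes up roughly 230 MB, and it all has to fit in RAM at once
print(object.size(big.df), units = "MB")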
When one thinks of problems related...