Traditional limitations of R
The usual scenario is simple. You've mined or collected unusually large amounts of data as part of your professional work or university research, and you appreciate the flexibility of the R language and its ever-growing, rich landscape of useful open-source libraries. So what next? Before too long you will be faced with the two traditional limitations of R:
Data must fit within the available RAM
R is generally very slow compared to other languages
Out-of-memory data
The first claim against using R for Big Data is that the entire dataset you want to process has to fit within the available RAM. Currently, most commercially sold, off-the-shelf personal computers are equipped with anywhere from 4 GB to 16 GB of RAM, which means that these values effectively set the upper bound on the size of the data you can analyze with R. In practice the bound is even lower, because from these limits you still need to deduct the memory consumed by other processes...
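To get a feel for these numbers, you can measure the memory footprint of a dataset directly from the R console. The following is a minimal sketch using only base R functions (object.size() and gc()); the one-million-row data frame is purely an illustrative example, and the exact figures it reports will vary between systems and R versions.

# Each numeric (double) value in R occupies 8 bytes, so a data frame of
# 1 million rows and 15 numeric columns needs roughly 1e6 * 15 * 8 bytes,
# that is, about 120 MB of RAM, before any temporary copies are made.
n_rows <- 1e6
df <- as.data.frame(matrix(rnorm(n_rows * 15), ncol = 15))

print(object.size(df), units = "Mb")  # memory occupied by this object alone
gc()                                  # memory used by the whole R session

On a machine with, say, 8 GB of RAM, it is easy to see how a dataset only an order of magnitude larger, together with the memory needed by the operating system and other applications, would quickly approach the limits described above.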