Re: [R] Practical Data Limitations with R

From: hadley wickham <>
Date: Tue, 08 Apr 2008 10:56:21 -0500

> We are new to R and evaluating if we can use it for a project we need to
> do. We have read that R is not well suited to handle very large data
> sets. Assuming we have the data prepped and stored in an RDBMS (Oracle,
> Teradata, SQL Server), what can R reasonably handle from a volume
> perspective? Are there some guidelines on memory/machine sizing based
> on data volume? We need to be able to handle millions of rows from
> several sources. Any advice is much appreciated. Thanks.

The most important question is what type of analysis you want to do with the data. Is the algorithm that implements the analysis O(n), O(n log n), or O(n^2)?
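On the memory/machine-sizing part of the question, a rough back-of-envelope sketch may help (the row and column counts below are made-up figures, not a recommendation): R keeps working objects in RAM, and a numeric (double) value costs 8 bytes, so the raw footprint of an all-numeric table is easy to estimate.

```r
# Back-of-envelope memory estimate for an all-numeric data frame.
# R holds working data in RAM; each double costs 8 bytes.
n    <- 10e6   # 10 million rows (hypothetical)
cols <- 20     # 20 numeric columns (hypothetical)

bytes <- n * cols * 8    # raw data only; object overhead and copies are extra
round(bytes / 2^20)      # ~1526 MB of RAM for the data alone
```

In practice you would budget a multiple of that (intermediate copies are common during analysis), which is one reason to push filtering and aggregation into the RDBMS and pull only the needed rows and columns into R, e.g. via a database interface package such as DBI or RODBC.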



______________________________________________ mailing list
PLEASE do read the posting guide
and provide commented, minimal, self-contained, reproducible code.
Received on Tue 08 Apr 2008 - 15:59:50 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Tue 08 Apr 2008 - 16:30:27 GMT.

