[R] What is the largest in-memory data object you've worked with in R?

From: Nathan Stephens <nwstephens_at_gmail.com>
Date: Fri, 04 Jun 2010 17:32:29 -0500


For me, I've found that I can easily work with 1 GB datasets, including linear models and aggregations. Working with 5 GB becomes cumbersome, and anything much over that makes R croak. I'm using a dual quad-core Dell with 48 GB of RAM.
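For a sense of scale, here is a minimal sketch of the kind of workload described above: build a data set of roughly 1 GB in memory, check its footprint, then run a linear model and an aggregation on it. The column names and row count are illustrative assumptions, not from the original post.

n <- 3e7  # ~30 million rows; three numeric columns plus a grouping
          # column come to roughly 1 GB. Reduce n for a quick test.
set.seed(1)
dat <- data.frame(
  x1    = rnorm(n),
  x2    = rnorm(n),
  group = sample(letters[1:10], n, replace = TRUE),
  y     = rnorm(n)
)

# Report the object's in-memory size
print(object.size(dat), units = "Gb")

# A linear model over the full data set
fit <- lm(y ~ x1 + x2, data = dat)
summary(fit)$coefficients

# An aggregation: group means of y
aggregate(y ~ group, data = dat, FUN = mean)

Note that lm() makes additional copies (the model frame and model matrix), so peak memory use during the fit is several times the size of the data itself, which is consistent with larger datasets becoming cumbersome well before physical RAM is exhausted.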

I'm wondering if there is anyone out there running jobs in the 100 GB range. If so, what does your hardware look like?

--Nathan



