[R] Memory limits for large data sets

From: Alan Cohen <CohenA_at_smh.toronto.on.ca>
Date: Wed, 05 Nov 2008 14:53:35 -0500


I have several very large data sets (1-7 million observations, sometimes hundreds of variables) that I'm trying to work with in R, and memory seems to be a big issue. I'm currently on a 2 GB Windows setup, but I may have the option to run R remotely on a server. If I understand correctly, R on Windows is essentially limited to about 2 GB of memory; is there the possibility of going much beyond that with server-based R? In other words, am I limited by R or by my hardware, and how much might R be able to handle if I get the necessary hardware?
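(A rough back-of-envelope check, assuming plain numeric storage at 8 bytes per value; the row and column counts below are just illustrative, taken from the sizes mentioned above:)

```r
## R stores numeric (double) values at 8 bytes each, plus per-object overhead.
rows <- 7e6          # observations
cols <- 100          # variables
gib  <- rows * cols * 8 / 2^30
gib                  # roughly 5.2 GiB -- well beyond a 2 GB address space

## On Windows, the per-session cap (in MB) can be queried and, within the
## limits of the OS, raised:
## memory.limit()             # current limit
## memory.limit(size = 4000)  # only effective on 64-bit Windows builds
```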

Also, any possibility of using web-based R for this kind of thing?

Alan Cohen

Post-doctoral Fellow
Centre for Global Health Research
70 Richmond St. East, Suite 202A
Toronto, ON M5C 1N8
(416) 854-3121 (cell)
(416) 864-6060 ext. 3156 (office)

R-help_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
Received on Wed 05 Nov 2008 - 20:23:45 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.

