[R] Absolute ceiling on R's memory usage = 4 gigabytes?

From: Kort, Eric <Eric.Kort_at_vai.org>
Date: Fri 02 Jul 2004 - 07:59:10 EST


Hello. By way of background, I am running out of memory when attempting to normalize the data from 160 Affymetrix microarrays using justRMA (from the affy package). This is despite making 6 gigabytes of swap space available on our SGI IRIX machine (which has 2 gigabytes of RAM). I have seen in various discussions statements such as "you will need at least 6 gigabytes of memory to normalize that many chips".
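For concreteness, what I am running is essentially the following (a minimal sketch; I actually pass our own CEL file names rather than reading the working directory):

    library(affy)
    ## read every CEL file in the working directory and compute RMA
    ## expression measures in one pass; justRMA() is already the
    ## memory-lean route (it avoids building a full AffyBatch), yet
    ## 160 chips still exhaust RAM + swap here
    eset <- justRMA(filenames = list.celfiles())

But my question is this: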

I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes, as attempting to do so results in this message:

WARNING: --max-vsize=4098M=4098`M': too large and ignored
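That is, starting R along these lines (the 4098M figure is just an example; any value above 4 gigabytes appears to be rejected):

    ## request a vector-heap ceiling just over 4 GB at startup
    R --max-vsize=4098M

    ## inside a running session, the limits actually in force can be
    ## inspected with mem.limits(); NA means no limit is set
    mem.limits()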

I experience this both on my Windows box (on which I cannot allocate more than 4 gigabytes of swap space anyway) and on the above-mentioned SGI IRIX machine (on which I can). In view of that, I do not see what good it does to make more than 4 gigabytes of RAM + swap available. Does this mean that 4 gigabytes is the absolute upper limit of R's memory usage, or perhaps 8 gigabytes, since you can set both the stack and the heap size to 4 gigabytes?
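My guess (and it is only a guess) is that the ceiling reflects 32-bit size arithmetic: if R holds these limits in a 32-bit unsigned integer, the largest representable value is 2^32 bytes:

    2^32 / 1024^3   # bytes representable in 32 bits, in gigabytes
    ## [1] 4

If that is right, then no combination of settings could push a single (32-bit) R process past 4 gigabytes, regardless of how much swap is available.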

Thanks,
Eric




R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Received on Fri Jul 02 08:03:36 2004
