RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

From: Kort, Eric <>
Date: Fri 02 Jul 2004 - 08:14:49 EST

>From: Liaw, Andy []
>Did you compile R as a 64-bit executable on the Irix? If not, R will be
>subject to the 4GB limit of 32-bit systems.


>Search the archive for `Opteron' and you'll see that the limit is not 4GB
>for 64-bit executables.

Excellent. I will recompile and try again.
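Why recompiling as 64-bit helps can be sketched with a quick back-of-the-envelope calculation (a hedged illustration of the pointer-width argument above, not R code or R internals): a 32-bit process can address at most 2^32 bytes, i.e. 4 GiB, no matter how much RAM plus swap the operating system makes available, whereas a 64-bit process has no practical address-space ceiling.

```python
# Illustration only: addressable memory is bounded by pointer width,
# which is why a 32-bit R build caps out at 4 GiB even with 6 GB of swap.

def addressable_gib(pointer_bits: int) -> float:
    """Maximum bytes addressable with the given pointer width, in GiB."""
    return 2 ** pointer_bits / 2 ** 30

print(addressable_gib(32))  # 4.0 -> the 4 GiB ceiling hit in this thread
print(addressable_gib(64))  # astronomically large; limit is RAM+swap, not pointers
```

This is the same reason the `--max-vsize=4098M` request is rejected on a 32-bit build: the value exceeds what the process could ever address.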


>> From: Kort, Eric
>> Hello. By way of background, I am running out of memory when
>> attempting to normalize the data from 160 affymetrix
>> microarrays using justRMA (from the affy package). This is
>> despite making 6 gigabytes of swap space available on our sgi
>> irix machine (which has 2 gigabytes of ram). I have seen in
>> various discussions statements such as "you will need at
>> least 6 gigabytes of memory to normalize that many chips",
>> but my question is this:
>> I cannot set the memory limits of R (1.9.1) higher than 4
>> gigabytes as attempting to do so results in this message:
>> WARNING: --max-vsize=4098M=4098`M': too large and ignored
>> I experience this both on my windows box (on which I cannot
>> allocate more than 4 gigabytes of swap space anyway) and on
>> the above-mentioned sgi irix machine (on which I can). In
>> view of that, I do not see what good it does to make > 4
>> gigabytes of ram+swap space available. Does this mean 4
>> gigabytes is the absolute upper limit of R's memory
>> usage...or perhaps 8 gigabytes since you can set both the
>> stack and the heap size to 4 gigabytes?
>> Thanks,
>> Eric
This email message, including any attachments, is for the so...{{dropped}}

mailing list PLEASE do read the posting guide!

Received on Fri Jul 02 08:18:27 2004

This archive was generated by hypermail 2.1.8 : Wed 03 Nov 2004 - 22:54:39 EST