RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

From: laurent buffat <laurent.buffat_at_it-omics.com>
Date: Tue 13 Jul 2004 - 22:35:30 EST


Hi Tae-Hoon,

I am very surprised by your answer:

When I try to make an AffyBatch with Bioconductor and R 1.9.1, I am unable to read and normalize more than 80 HU-133A CEL files on a 32-bit Linux machine with 4 GB of RAM plus 8 GB of swap. (This is with no other processes running on the machine, of course. I do not want to use "justRMA" because I want the probe-level information in the AffyBatch. It is not a limit in the R configuration either: if I follow memory usage during the R session, R uses all 4 GB of RAM, without touching the swap, before the memory error occurs.)
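
For reference, the session looks roughly like this (the directory name is a placeholder; this is the standard affy workflow, not my exact script):

    library(affy)  # Bioconductor affy package, R 1.9.1
    # read all CEL files into one AffyBatch (keeps the probe-level data)
    cels <- list.celfiles("celdir", full.names = TRUE)  # "celdir" is illustrative
    abatch <- ReadAffy(filenames = cels)
    eset <- rma(abatch)  # background correct, quantile normalize, summarize

Reading and normalizing this way holds the whole AffyBatch in RAM, which is where the 4 GB is exhausted; justRMA() would avoid this, but it discards the probe-level object that I need.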

For this reason we are planning to buy a 64-bit Linux machine, but if this problem can be solved with Mac OS X and 1.5 GB of RAM, I will buy a Mac instead of a 64-bit Linux computer.

So, what kind of normalization are you doing? One from Bioconductor's affy package, or something else? Could you give details?

For the other R & BioC users:

Do you think there is a difference between Linux and Mac OS X in R's memory management?

What is a "good" hardware solution for R on 64-bit Linux?

Thanks for your help.

laurent

-----Original Message-----
From: r-help-bounces@stat.math.ethz.ch
[mailto:r-help-bounces@stat.math.ethz.ch] On behalf of Tae-Hoon Chung
Sent: Friday, 2 July 2004 01:52
To: Kort, Eric
Cc: r-help@stat.math.ethz.ch
Subject: Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

Hi, Eric.
This seems a little puzzling to me. Which Affymetrix chip are you using? I ask because yesterday I was able to normalize 150 HU-133A CEL files (22,283 probe sets each) using R 1.9.1 on Mac OS X 10.3.3 with 1.5 GB of memory. If your chip has more probes than that, the failure would be understandable...

On Jul 1, 2004, at 2:59 PM, Kort, Eric wrote:

> Hello. By way of background, I am running out of memory when
> attempting to normalize the data from 160 Affymetrix microarrays using
> justRMA (from the affy package). This is despite making 6 gigabytes
> of swap space available on our SGI IRIX machine (which has 2 gigabytes
> of RAM). I have seen in various discussions statements such as "you
> will need at least 6 gigabytes of memory to normalize that many
> chips", but my question is this:
>
> I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes as
> attempting to do so results in this message:
>
> WARNING: --max-vsize=4098M=4098`M': too large and ignored
>
> I experience this both on my Windows box (on which I cannot allocate
> more than 4 gigabytes of swap space anyway) and on the above-mentioned
> SGI IRIX machine (on which I can). In view of that, I do not see what
> good it does to make more than 4 gigabytes of RAM + swap space
> available. Does this mean 4 gigabytes is the absolute upper limit of
> R's memory usage... or perhaps 8 gigabytes, since you can set both the
> stack and the heap size to 4 gigabytes?
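>
> (For reference, the invocation is simply something like
>
>     R --max-vsize=4098M
>
> which is what produces the warning above; checking afterwards with
> mem.limits() or gc() should show that the limit was not changed.)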
>
> Thanks,
> Eric
>
Tae-Hoon Chung, Ph.D.

Post-doctoral Research Fellow
Molecular Diagnostics and Target Validation Division
Translational Genomics Research Institute
1275 W Washington St, Tempe AZ 85281 USA
Phone: 602-343-8724




R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Received on Tue Jul 13 22:47:16 2004
