RE: R-alpha: Memory Requirements for Large Datasets

Richard Spady (spady@torcello.nuff.ox.ac.uk)
Tue, 4 Mar 1997 16:34:14 +0000 (GMT)


To: "Matthew R. Nelson" <mrn@superh.hg.med.umich.edu>
In-Reply-To: <Pine.3.87.9703040942.B15912-0100000@superh.hg.med.umich.edu>



On Tue, 4 Mar 1997, Matthew R. Nelson wrote:

> 
> Here at work, I am working on a Linux-ELF (i586-90) with 40 MB RAM.  Even 
> cranking up my startup arguments to
>      R -v 30 -n 10000000
> the largest dataset that I could read in was 53 x 2000 (twice the size of 
> the previous largest with the default startup values), which corresponded 
> to an ASCII file of 635 KB.
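
(For scale: 53 x 2000 doubles is 53 * 2000 * 8 = 848,000 bytes, under
1 MB of raw data, so even allowing for R's per-object overhead a 30 MB
heap should hold it comfortably; failing at that size already suggests
the -v setting is not taking effect.)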

A little while ago I wrote to r-testers saying that R -v xx was not
actually working: the -v argument appeared not to be setting the
environment properly for the executable to pick up. (I may have the
model of the mechanism wrong; it's been a while.) This wasn't true in
R 0.12 but was a problem by 0.15.
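
To make that model concrete, here is a rough sketch of how I imagine
the front end passes the size through the environment; the variable
name R_VSIZE and the details are my guesses, not the actual source:

    /* Sketch only: hypothetical mechanism, not R's real main.c.
     * The wrapper script would export R_VSIZE=<MB> for "-v <MB>",
     * and the binary would read it back like this. */
    #include <stdlib.h>

    static long vheap_bytes(long dflt)
    {
        char *s = getenv("R_VSIZE");       /* hypothetical name */
        if (s != NULL) {
            long mb = strtol(s, NULL, 10); /* megabytes from -v */
            if (mb > 0)
                return mb * 1024L * 1024L; /* MB -> bytes */
        }
        return dflt;                       /* compiled-in default */
    }

If the script fails to export the variable (or exports it under a
different name than the binary reads), you get exactly the symptom
above: -v is accepted but silently ignored.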

This too was on Linux-ELF (Red Hat 3.0.3).

I eventually just recompiled main.c with the memory allocation
hard-wired to 40 MB, and now I read huge files just fine.
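
For what it's worth, the change was of this flavor (the identifier is
hypothetical, from memory rather than the 0.15 source):

    /* In main.c: skip the -v/environment lookup entirely and
     * hard-wire the vector heap to 40 MB.  R_VSize stands in for
     * whatever the real variable is called. */
    R_VSize = 40L * 1024L * 1024L;  /* bytes; was the -v default */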