R-alpha: Memory Requirements for Large Datasets

Matthew R. Nelson (mrn@superh.hg.med.umich.edu)
Tue, 4 Mar 1997 07:02:23 -0500 (EST)


To: r-testers@stat.math.ethz.ch
Message-Id: <Pine.3.87.9703040723.A16615-0100000@superh.hg.med.umich.edu>

R-Testers,

I have been frustrated by the apparently memory-hungry nature of R.  I 
have attempted to read in a matrix (read.table) that is a little over 1 MB 
in size (~4000 observations with 43 traits), but am told I lack 
sufficient memory.  It is not until I pare the dataset down to 1000 
observations (~0.25 MB) that R will accept it.  I am running R on 
Linux-ELF (an i586-75) with 16 MB of memory; I installed an additional 
8 MB of RAM, but still find R incapable of reading in much more.
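For reference, the call I am attempting looks roughly like this (the file 
name is just a placeholder; the dimensions are as described above):

```r
# Placeholder file name; the real data are ~4000 observations x 43 traits (~1 MB).
traits <- read.table("traits.dat")   # fails: insufficient memory
# Only after paring the file down to ~1000 observations (~0.25 MB)
# does the same call succeed.
```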

Are there structural limitations within R that limit its ability to deal 
with large objects?  Based on your experience, how much more system 
memory would I require before I could start handling data files on the 
order of 2 MB?

Thanks for any suggestions,

Matt
----------------------------------------------------------------------------
Matthew R. Nelson
Dept. of Human Genetics         
University of Michigan              http://www-personal.umich.edu/~ticul/
4711 Medical Science II             email: ticul@umich.edu
Ann Arbor, MI  48109-0618           phone: (313) 647-3151
----------------------------------------------------------------------------


=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
r-testers mailing list -- For info or help, send "info" or "help",
To [un]subscribe, send "[un]subscribe"
(in the "body", not the subject !)  To: r-testers-request@stat.math.ethz.ch
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-