RE: R-alpha: Memory Requirements for Large Datasets

Martyn Plummer
Tue, 04 Mar 1997 13:37:09 +0100 (MET)


On 04-Mar-97 wrote:

>I have been frustrated by the apparently memory-hungry nature of R.  I
>have attempted to read in a matrix (read.table) that is a little over 1MB
>in size (~4000 observations with 43 traits), but am told I lack
>sufficient memory.  It is not until I pare this dataset down to 1000
>observations (~0.25 MB) that R will accept it.  Running R on Linux-ELF
>with 16 MB of memory (on an i586-75), I installed 8 MB of additional RAM,
>but still find R incapable of retrieving much more.

You didn't say whether you tried the -v flag to increase the size of the
vector heap (the default is only 2Mb). This is documented in the man page
(R.1), which can be found at the top level of the R source directory tree
if you haven't already installed it.
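For example, the heap can be enlarged at startup from the command line (a
sketch following the R.1 man page described above; the sizes chosen here
are illustrative, not recommendations):

```shell
# Start R with a 10 Mb vector heap instead of the default 2 Mb.
# -n raises the number of cons cells; the value below is an
# illustrative guess, not a tuned setting.
R -v 10 -n 500000
```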

You'll find that you need a lot more memory to read a dataset in than you
do to work with it afterwards.

By the way, if someone could explain what a cons cell is (the option set
with the -n flag), I'd be very grateful. I have found that I also need to
increase this from its default value when reading large data sets.

r-testers mailing list -- For info or help, send "info" or "help";
to [un]subscribe, send "[un]subscribe"
(in the "body", not the subject!)  To: