Re: [R] Memory problem?

From: Jay Emerson <>
Date: Thu, 31 Jan 2008 08:31:07 -0500


Page 23 of the R Installation Guide provides some memory guidelines that you might find helpful.

There are a few things you could try in R, at least to get up and running:

With any of these options, you are still very much constrained by the type of analysis you are attempting. Almost any existing procedure (e.g. a Cox model) would need a regular R object (probably impossible here), and you are back to square one. One exception is Thomas Lumley's biglm package, which processes the data in chunks; we need more tools like it. Ultimately, you'll need to find a method of analysis that is memory-efficient, and this may not be easy.
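To illustrate the chunked approach, here is a minimal sketch using biglm's documented interface: fit on the first chunk, then fold in further chunks with update(). The variable names, the simulated data, and the chunk size are all illustrative, not from the original post.

```r
# Sketch: fitting a linear model in chunks with the biglm package,
# so the full data set never has to sit in memory at once.
library(biglm)

set.seed(1)
# Hypothetical helper that stands in for reading one chunk from disk.
make_chunk <- function(n) {
  x1 <- rnorm(n); x2 <- rnorm(n)
  data.frame(x1 = x1, x2 = x2, y = 1 + 2 * x1 - x2 + rnorm(n))
}

fit <- biglm(y ~ x1 + x2, data = make_chunk(1000))  # fit on first chunk
fit <- update(fit, make_chunk(1000))                # add a second chunk
summary(fit)
```

In real use, make_chunk() would be replaced by reading successive blocks of rows from a file or database connection, so memory use stays bounded by the chunk size rather than the full data set.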

Best of luck,


Original message:

I am trying to fit a Cox model for predicting relapse in 80 cancer tumors, taking into account the expression of 17,000 genes. The data are large and I get an error:
"cannot allocate vector of size 2.4 Mb". I increased memory.limit to 4000 (the largest my computer supports), but I still get the error because of other large variables in my workspace. Does anyone know how to overcome this problem?
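Since the poster mentions other large variables crowding the workspace, a first step is usually to measure what is taking the space and drop what isn't needed before the fit. A minimal sketch (the object name big_matrix is hypothetical):

```r
# Sketch: inspect and free workspace memory before a large fit.
big_matrix <- matrix(0, nrow = 1000, ncol = 1000)   # stand-in large object

# How big is one object, and which objects dominate the workspace?
print(object.size(big_matrix), units = "Mb")
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
print(sort(sizes, decreasing = TRUE))

rm(big_matrix)   # drop objects you no longer need
gc()             # trigger garbage collection and report memory in use
```

This won't make an inherently too-large analysis fit, but it ensures the allocator actually has the headroom that memory.limit promises.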

Many thanks in advance,

John W. Emerson (Jay)
Assistant Professor of Statistics
Director of Graduate Studies
Department of Statistics
Yale University
Statistical Consultant, REvolution Computing

______________________________________________ mailing list
PLEASE do read the posting guide
and provide commented, minimal, self-contained, reproducible code.
Received on Thu 31 Jan 2008 - 13:38:11 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Thu 31 Jan 2008 - 14:00:09 GMT.

