[R] memory

From: Ferran Carrascosa <ferran.carrascosa_at_gmail.com>
Date: Tue 30 Aug 2005 - 08:00:32 EST


Hi,

I have a matrix with 700,000 x 10,000 cells of floating-point data. I would like to work with the entire table, but I run into a lot of memory problems. I have read the ?Memory help page.
I work on Windows 2000 with R 2.1.0.

The only solution that I have applied is:
> memory.limit(size=2048)

But now my problems are:
- I need to work with more than 2 GB. How can I exceed this limit?
- When I apply some algorithms, the maximum number of cells in one object (approx. 2*10^9) is reached.
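For scale, a rough back-of-the-envelope calculation (assuming double-precision values at 8 bytes each, which is how R stores numeric data) shows why both limits are hit:

```r
# Approximate memory needed for a 700,000 x 10,000 numeric matrix
n_rows <- 7e5
n_cols <- 1e4

cells <- n_rows * n_cols          # 7e9 cells, above R's 2^31 - 1 element
                                  # limit for a single vector/matrix
gib   <- cells * 8 / 2^30         # 8 bytes per double -> ~52 GiB

cells > 2^31 - 1                  # TRUE: object too large for one vector
gib                               # far beyond the 2 GB Windows limit
```

So even before the 2 GB address-space limit on 32-bit Windows, the matrix exceeds the maximum element count of a single R object; any strategy will have to process the data in chunks (e.g. block-wise from disk or a database) rather than as one matrix.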

Please could you send me some advice/strategies on working with large amounts of data in R?

Does R have a way to work with less memory?

Thanks in advance,

-- 
Ferran Carrascosa

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Tue Aug 30 08:15:36 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:39:59 EST