Re: [R] increasing memory

From: Uwe Ligges (ligges@statistik.uni-dortmund.de)
Date: Tue 04 May 2004 - 22:28:40 EST


Message-id: <40978C78.7050905@statistik.uni-dortmund.de>

Janet Rosenbaum wrote:

> Hi. I want to use R with very large files, a couple hundred megabytes,
> but R crashes every time that I try.
>
> Reading the help files seems to indicate that R ought to manage its memory
> itself. I know I have enough memory, since Stata handles these files
> perfectly well. I have a Mac running OS X 10.3 and am running RAqua 1.8.1.
>
> Is there anything I can do to make it deal with these files successfully?
>
> Janet

I guess you mean that R gives an error but does *not* crash (if it
really crashes, that is a bug that needs to be fixed, and you should
cross-check with a recent version of R).

If it only gives an error, either read in the data in a more
memory-efficient way (if there is one; you have not told us how you
tried to read the data in), or "increase memory", as the subject line
already suggests.

Uwe Ligges

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

