From: Uwe Ligges (firstname.lastname@example.org)
Date: Tue 04 May 2004 - 22:28:40 EST
Janet Rosenbaum wrote:
> Hi. I want to use R with very large files, a couple hundred megabytes,
> but R crashes every time that I try.
> Reading help files seems to indicate that R ought to manage its memory
> itself. I know I have enough memory since Stata handles these files
> perfectly well. I have a Mac running OS X 10.3 and am running RAqua 1.8.1.
> Is there anything I can do to make it deal with these files successfully?
I guess you mean R gives an error, but does *not* crash (if it crashes,
it is a bug that needs to be fixed, and you should cross-check with a
recent version of R).
If it gives an error, either read in the data in a more appropriate way
(if there is one; you have not told us how you tried to read them in),
or "increase memory", as the subject line already suggests.
R-help mailing list
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
This archive was generated by hypermail 2.1.3 : Mon 31 May 2004 - 23:05:07 EST