Re: [R] problem in reading large files

From: Duncan Murdoch <murdoch_at_stats.uwo.ca>
Date: Sat 12 Aug 2006 - 12:08:08 EST

On 8/11/2006 8:50 PM, T Mu wrote:
> I was trying to read a large .csv file (80 columns, 400,000 rows, size of
> about 200MB). I used scan(), R 2.3.1 on Windows XP. My computer is AMD 2000+
> and has 512MB RAM.

You should get R-patched; there were some bugs with low memory handling fixed recently:

 From CHANGES:

R could crash when very low on memory. (PR#8981)

You should also get more physical memory. 512MB is not much for handling 200MB of data. You can fairly easily benefit from increasing up to 2 GB, and will benefit (with some work) from even more, up to 4 GB.
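Beyond adding RAM, you can reduce R's peak memory use by telling read.csv the column types and row count up front, so it doesn't have to guess types or repeatedly re-allocate storage. A minimal sketch (the file name "big.csv" and all-numeric columns are assumptions here, not from the original post):

```r
## Sketch: pre-declaring column classes and the row count keeps
## read.csv from type-guessing and over-allocating on a 512MB machine.
## "big.csv" and rep("numeric", 80) are placeholders for your file.
dat <- read.csv("big.csv",
                colClasses = rep("numeric", 80),  # skip type detection
                nrows = 400000,                   # pre-size the result
                comment.char = "")                # avoid comment scanning

## The scan() equivalent: give `what` a list of one template per column.
dat2 <- scan("big.csv", what = rep(list(numeric(0)), 80),
             sep = ",", skip = 1)                 # skip = 1 drops the header
```

If even that exceeds available memory, reading the file in chunks (via the nrows and skip arguments, processing each piece before reading the next) keeps only one chunk resident at a time.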

Duncan Murdoch

>
> It sometimes freezes my PC, sometimes just shuts down R quietly.
>
> Is there a way (option, function) to better handle large files?
>
> Seemingly SAS can deal with it with no problem, but I just persuaded my
> professor to switch to R, so it is quite disappointing.
>
> Please help, thank you.
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Received on Sat Aug 12 12:12:30 2006

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.1.8, at Sat 12 Aug 2006 - 16:19:23 EST.
