[R] problem in reading large files

From: T Mu <muster_at_gmail.com>
Date: Sat 12 Aug 2006 - 10:50:48 EST

I was trying to read a large .csv file (80 columns, 400,000 rows, about 200 MB) with scan(), using R 2.3.1 on Windows XP. My computer is an AMD 2000+ with 512 MB of RAM.

It sometimes freezes my PC, and sometimes just shuts R down quietly.

Is there a way (option, function) to better handle large files?
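(For context, here is a sketch of the kind of memory-saving options I mean. The file name "big.csv", the chunk size, and the assumption that all 80 columns are numeric are made up for illustration; I understand declaring column types with colClasses/what avoids R's type guessing and intermediate copies, and reading from an open connection in blocks keeps peak memory down.)

```r
## Assumed: "big.csv" has a header row and 80 numeric columns.
cc <- rep("numeric", 80)

## Declaring colClasses and nrows up front lets read.csv allocate once
## instead of guessing types and growing/copying vectors.
x <- read.csv("big.csv", colClasses = cc, nrows = 400000,
              comment.char = "")

## Alternatively, read in chunks from an open connection and process
## each block, so the whole table never has to fit in memory at once.
con <- file("big.csv", "r")
invisible(readLines(con, n = 1))   # skip the header line
repeat {
  block <- tryCatch(
    read.csv(con, header = FALSE, colClasses = cc, nrows = 50000),
    error = function(e) NULL)      # read.csv errors at end of file
  if (is.null(block)) break
  ## ... process 'block' here ...
  if (nrow(block) < 50000) break
}
close(con)
```

Is something along these lines the recommended approach, or is there a better option?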

Seemingly SAS can deal with it without a problem, but I just persuaded my professor to switch to R, so this is quite disappointing.

Please help, thank you.


R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
Received on Sat Aug 12 10:54:35 2006

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.1.8, at Sat 12 Aug 2006 - 14:18:33 EST.