[R] Row limit for read.table

From: Frank McCown <fmccown_at_cs.odu.edu>
Date: Wed 17 Jan 2007 - 15:39:56 GMT


I have been trying to read in a large data set using read.table, but I've only been able to grab the first 50,871 of the 122,269 total rows:

> f <- read.table("http://www.cs.odu.edu/~fmccown/R/Tchange_rates_crawled.dat",
+                 header=TRUE, nrows=123000, comment.char="", sep="\t")
> length(f$change_rate)
[1] 50871
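
For reference, a diagnostic along these lines (assuming a local copy of the file) would show whether something like an embedded quote character is cutting the read short; the quote="" setting below is a guess, not a confirmed fix:

## Sketch: the file name assumes a local download.
## Compare the raw line count against the expected 122,269 rows
## (plus one header line); if they match but read.table returns fewer
## rows, an embedded quote character is a likely culprit, and quote=""
## makes read.table treat quotes as ordinary characters.
length(readLines("Tchange_rates_crawled.dat"))
table(count.fields("Tchange_rates_crawled.dat", sep = "\t", quote = ""))
f <- read.table("Tchange_rates_crawled.dat", header = TRUE, sep = "\t",
                comment.char = "", quote = "", nrows = 123000)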

From searching the email archives, I believe this is due to size limits of a data frame. So...

  1. Why doesn't read.table give a proper warning when it doesn't place every row it reads into the resulting data frame?
  2. Why isn't there a parameter to read.table that allows the user to specify which columns s/he is interested in? That would let extraneous columns be ignored, which would improve memory usage. (A sketch of one possible approach follows this list.)
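
On the second point, colClasses may already cover it: entries of "NULL" make read.table skip those columns entirely. A minimal sketch, assuming (purely for illustration) a five-column file where only the second and fifth columns are wanted:

## Sketch: the five-column layout and the column types are assumptions.
## "NULL" entries tell read.table to skip those columns, saving memory.
f <- read.table("Tchange_rates_crawled.dat", header = TRUE, sep = "\t",
                comment.char = "",
                colClasses = c("NULL", "character", "NULL", "NULL", "numeric"))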

I've already worked around this by loading the table into MySQL and selecting the two columns I need. I just wonder why the above two points aren't implemented. Maybe they are and I'm totally missing it.

Thanks,
Frank

-- 
Frank McCown
Old Dominion University
http://www.cs.odu.edu/~fmccown/

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
Received on Thu Jan 18 02:50:52 2007
