Re: [R] naive question

From: <rivin_at_euclid.math.temple.edu>
Date: Thu 01 Jul 2004 - 03:56:43 EST

> I suspect that what has happened is that memory, CPU speed, and I/O
> speed have evolved at different rates, so what used to be acceptable
> code in read.table() (in both R and S-plus) is now showing its
> limitations and has reached the point where it can take half an hour to
> read in, on a readily-available computer, the largest data table that
> can be comfortably handled. I'm speculating, but 10 years ago, on a
> readily available computer, did it take half an hour to read in the
> largest data table that could be comfortably handled in S-plus or R?

I did not use R ten years ago, but "reasonable" RAM amounts have grown by roughly a factor of 10 (from 128 MB to 1 GB), CPU speeds by a factor of about 30 (from 90 MHz to 3 GHz), and disk space availability probably by a factor of 10 as well. So, unless I/O performance scales nonlinearly with data size (which would be a bit strange, but not inconsistent with my R experiments), I would expect things to have gotten faster by the wall clock, not slower. Of course, it is possible that the other components of the R system have received more attention -- I am not equipped to comment...
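
For what it's worth, one rough way to check the scaling question (the table sizes and column layout below are arbitrary) is to time read.table() at a few row counts and see whether the per-row cost stays roughly constant:

  ## Time read.table() on temporary files of increasing size
  ns <- c(1e3, 1e4, 1e5)
  times <- sapply(ns, function(n) {
    x <- data.frame(a = rnorm(n), b = rnorm(n),
                    c = sample(letters, n, replace = TRUE))
    f <- tempfile()
    write.table(x, f, row.names = FALSE)
    elapsed <- system.time(read.table(f, header = TRUE))[["elapsed"]]
    unlink(f)
    elapsed
  })
  ## If reading scaled linearly with the number of rows,
  ## times/ns would be roughly constant
  print(times / ns)

If times/ns grows noticeably with n, that would be the kind of nonlinear scaling I was referring to.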

  Igor


