Re: [R] FW: Large datasets in R

From: Marshall Feldman <marsh_at_uri.edu>
Date: Wed 19 Jul 2006 - 07:11:32 EST


Well, SPSS used to claim that all its algorithms dealt with only one case at a time and therefore that it could handle very large files. I suppose a large correlation matrix could cause it problems.
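A minimal sketch of that one-pass idea in R, assuming a hypothetical comma-separated file "big.csv" with a numeric column "x" (the file and column names are made up for illustration). The file is read through a connection in fixed-size chunks, so only one chunk ever sits in RAM, and a running sum and count give the mean in a single pass; anything that needs all the data at once, such as a full correlation matrix, would not fit this pattern:

con <- file("big.csv", open = "r")
hdr <- strsplit(readLines(con, n = 1), ",")[[1]]   # column names from the header row
running_sum <- 0
running_n   <- 0
repeat {
  ## read the next 10,000 rows; NULL signals that the connection is exhausted
  chunk <- tryCatch(
    read.csv(con, header = FALSE, col.names = hdr, nrows = 10000),
    error = function(e) NULL)
  if (is.null(chunk) || nrow(chunk) == 0) break
  running_sum <- running_sum + sum(chunk$x, na.rm = TRUE)
  running_n   <- running_n   + sum(!is.na(chunk$x))
}
close(con)
running_sum / running_n   # mean of x without ever loading the whole file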

        Marsh Feldman

-----Original Message-----
From: Ritwik Sinha [mailto:ritwik.sinha@gmail.com]
Sent: Tuesday, July 18, 2006 10:54 AM
To: Prof Brian Ripley
Cc: Marshall Feldman; r-help@stat.math.ethz.ch
Subject: Re: [R] FW: Large datasets in R

Hi,

I have a related question. How do other statistical packages handle large data sets differently?

The original post claims that 350 MB is fine in Stata. Someone suggested S-Plus. I have heard people say that SAS can handle large data sets. Why can the others do it while R seems to have a problem? Don't these packages load the data into RAM?

-- 
Ritwik Sinha
Graduate Student
Epidemiology and Biostatistics
Case Western Reserve University

http://darwin.cwru.edu/~rsinha

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.