Re: [R] Big matrix memory problem

From: Spencer Graves <spencer.graves_at_pdf.com>
Date: Sat 14 May 2005 - 09:57:40 EST

          S-Plus 7 advertises facilities for large data sets (http://www.insightful.com/products/splus/default.asp#largedata). Their web site says this is done with a "New Pipeline Architecture" that "streams large data sets through available RAM instead of reading the entire data set into memory at once", and that it "includes a new data type for dealing with very large data objects". If you want more than this, I suggest you post to the S-News list (s-news@lists.biostat.wustl.edu); I haven't used S-Plus 7 myself.

	  hope this helps.
	  spencer graves
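
	  (A quick back-of-the-envelope calculation in R shows why the dense case is hopeless on 512 MB of RAM: each 1000x1000 matrix of doubles takes 8 bytes per element, and there are 256 of them.)

```r
# Memory needed for 256 dense 1000x1000 matrices of doubles
# (8 bytes per element), before any copying overhead.
bytes <- 256 * 1000 * 1000 * 8
bytes / 2^20  # about 1953 MB, i.e. roughly 2 GB -- far beyond 512 MB
```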

Gabor Grothendieck wrote:

> On 5/13/05, s104100026 <n.d.fitzgerald@mars.ucc.ie> wrote:
> 

>>Hi All,
>>
>>I want to read 256 1000x1000 matrices into R. I understand that it is unlikely
>>that I can do this, but in the hope that somebody can help me I am mailing this
>>list.
>>
>>I have tried increasing my memory size (I understand that it is limited to the
>>minimum of 1024 MB and the computer's RAM, in my case 512 MB).
>>
>>Does anyone think this is possible in R? Could it be tried in S-Plus, for
>>example?
>>
> 
> 
> If they are sparse you could try the SparseM package.
> 
> ______________________________________________
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
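
If the matrices really are mostly zeros, a minimal sketch of the SparseM approach might look like the following (it assumes the SparseM package has been installed from CRAN; the dense intermediate here is only for illustration — in practice you would build the sparse form directly from your data):

```r
library(SparseM)  # assumes SparseM is installed from CRAN

# Illustrative: a 1000x1000 matrix with only 100 nonzero entries.
m <- matrix(0, 1000, 1000)
m[sample(length(m), 100)] <- rnorm(100)

# Compressed sparse row storage keeps only the nonzero entries,
# so it is far smaller than the ~7.6 MB dense representation.
sm <- as.matrix.csr(m)
object.size(sm)
```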

Received on Sat May 14 10:03:16 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:31:46 EST