RE: [R] about memory

From: Huntsinger, Reid <>
Date: Thu 31 Mar 2005 - 01:23:57 EST

hclust works from a distance matrix (built with dist()). In your case that is 10,000 x 10,000. For various reasons several copies are created, so you probably need at least

100M entries x 8 bytes per entry x 3 copies = 2.4 GB

just for the distance matrix. If you don't have that much RAM, the computation will probably take longer than you're willing to wait.
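A quick back-of-the-envelope check of that figure in R (a sketch; exact object sizes vary by R version and platform):

    n <- 10000
    n * n * 8 * 3 / 2^30   # full matrix, 3 copies: ~2.2 GiB (2.4e9 bytes)

    ## Empirical check on a smaller problem, then scale up. Note that
    ## dist() itself stores only the lower triangle, n*(n-1)/2 doubles;
    ## the extra working copies are what push the total up.
    x <- matrix(rnorm(1000 * 3), ncol = 3)    # 1,000 cases, 3 variables
    as.numeric(object.size(dist(x))) / 2^20   # ~3.8 MB; ~100x that at n = 10,000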

Reid Huntsinger

-----Original Message-----
From: [] On Behalf Of ronggui
Sent: Wednesday, March 30, 2005 5:37 AM
Subject: [R] about memory

here is my system memory:
ronggui@0[ronggui]$ free

             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

and I want to cluster my data using hclust; the data has 3 variables and 10,000 cases, but it fails, saying there is not enough memory for the vector size. I read the help docs and used $ R --max-vsize=800M to start R 2.1.0 beta under Debian Linux, but it still does not work. Is my PC's memory not enough to carry out this analysis, or did I make a mistake in setting the memory?
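For reference, a minimal sketch of the kind of call that fails here (the object names are made up and the data is simulated; the layout is assumed from the description above):

    x  <- matrix(rnorm(10000 * 3), ncol = 3)  # 10,000 cases, 3 variables
    d  <- dist(x)    # allocating this is typically where
                     # "cannot allocate vector of size ..." appears
    hc <- hclust(d)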

thank you.

Received on Thu Mar 31 01:30:22 2005
