[R] clara - memory limit

From: Nestor Fernandez <nestor.fernandez_at_ufz.de>
Date: Thu 04 Aug 2005 - 02:44:38 EST


Dear all,

I'm trying to estimate clusters from a very large dataset using clara, but the call stops with a memory error. The (very simple) code and the resulting error are:

library(foreign)  # for read.dbf
library(cluster)  # for clara
mydata <- read.dbf(file = "fnorsel_4px.dbf")
my.clara.7k <- clara(mydata, k = 7)

Error: cannot allocate vector of size 465108 Kb

The dataset contains >3,000,000 rows and 15 columns. I'm using a Windows computer with 1.5 GB of RAM; I also tried raising the memory limit to the maximum possible (4000 MB).
Is there a way to calculate clara clusters from such large datasets?
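For completeness, here is roughly how I raised the memory limit from within R (a minimal sketch; memory.limit() is the Windows-only mechanism, and my exact calls may have differed slightly):

# check the current limit (in MB), then request the maximum
memory.limit()             # report current limit
memory.limit(size = 4000)  # ask Windows R for up to ~4000 MB
# clara() still stops with the same allocation error afterwards
my.clara.7k <- clara(mydata, k = 7)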

Thanks a lot.

Nestor.-



