[R] slow aggregate function

From: Bert Jacobs <bert.jacobs_at_figurestofacts.be>
Date: Wed, 05 Nov 2008 11:11:45 +0100

   

Hi,

I've written the following line of code to make a summary of some data:  

Final.Data.Short <- as.data.frame(aggregate(Merge.FinalSubset[, 8:167],
                                            list(Location = Merge.FinalSubset$Location,
                                                 Measure = Merge.FinalSubset$Measure,
                                                 Site = Merge.FinalSubset$Site,
                                                 Label = Merge.FinalSubset$Label),
                                            FUN = sum))

Where "Merge.FinalSubset" is a dataframe of 2640 rows and 167 columns

The result "Final.Data.Short" is a dataframe of 890 rows and 164 columns  

This operation currently takes more than a minute. I was wondering whether there are ways to reduce that time, either with different code or by splitting the original data frame into smaller pieces, running several separate aggregations, and recombining the results.
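For instance, would something along these lines be faster? This is only a sketch of what I had in mind (assuming the grouping columns really are named Location, Measure, Site and Label, and that columns 8:167 are all numeric); as far as I know rowsum() in base R does grouped sums in compiled code and tends to be much quicker than aggregate():

# one grouping factor built from the four key columns
grp <- interaction(Merge.FinalSubset$Location,
                   Merge.FinalSubset$Measure,
                   Merge.FinalSubset$Site,
                   Merge.FinalSubset$Label,
                   drop = TRUE)

# sum columns 8:167 within each group; the group labels become the row names
Final.Data.Short2 <- rowsum(Merge.FinalSubset[, 8:167], group = grp)

The group labels end up pasted together in the row names (separated by "."), so I would still have to split them back out into four columns to get exactly the same layout that aggregate() returns.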

Thanks for helping me out.

Bert      




R-help_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
