[R] growing a list sequentially -- memory management

From: erwann rogard <erwann.rogard_at_gmail.com>
Date: Thu, 13 Nov 2008 17:15:01 -0500


Hello,

I have something like:

out <- list()

for (i in 1:n) {
  data <- gen(...)       # fixed-size data on each replication
  out[[i]] <- fun(data)
}
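(For concreteness, here is a self-contained version of that pattern; gen and fun below are hypothetical stand-ins, since the real ones aren't shown:)

n <- 300
gen <- function() rnorm(100)          # stand-in: generates fixed-size data
fun <- function(data) summary(data)   # stand-in: per-replication result

out <- list()
for (i in 1:n) {
  data <- gen()
  out[[i]] <- fun(data)  # the list grows by one element per iteration
}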

> object.size(out[[1]])
6824

In principle, 1 GB should allow

n = 1024^3 / 6824 ≈ 157,347?
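A quick check of that arithmetic at the R prompt:

> 1024^3 / 6824
[1] 157347.9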

I have about 2 GB not taken by other processes. However, I can see free memory shrinking quite rapidly in my system monitor, and I have to stop the simulation after only n = 300. Why such a discrepancy? Any remedy?
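For what it's worth, a common remedy for this growth pattern is to pre-allocate the list once with vector("list", n) rather than extending it element by element, so each assignment fills an existing slot instead of forcing R to reallocate and copy the whole list. A minimal sketch, reusing the hypothetical gen/fun stand-ins from above:

out <- vector("list", n)  # allocate all n slots up front
for (i in 1:n) {
  data <- gen()
  out[[i]] <- fun(data)   # fills slot i in place; no list reallocation
}
gc()                      # report memory use and trigger garbage collection

Equivalently, out <- lapply(1:n, function(i) fun(gen())) avoids the explicit loop and allocates the result list in one go.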

x86_64-pc-linux / RKWard / R 2.8.0 / 4 GB RAM

Thanks.
