[R] Memory usage of R on Windows XP


From: Peter Wilkinson (pwilkinson@videotron.ca)
Date: Fri 21 May 2004 - 10:58:37 EST


Message-id: <5.2.0.9.0.20040520204754.00b3c2b0@pop.videotron.ca>


I am running R 1.8.1 on Windows XP. I have been using the 'apply' R
function to run a short function against a matrix of 19000 rows x 340
columns ... yes, it is a big matrix. Every item in the matrix is a float
with a maximum value of 2^16 ~ 65k.

The function:

mask256 <- function(value) {
    if (value < 256) {
        result <- 0
    } else {
        result <- 1
    }
    result
}
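(For reference, comparison operators in R are already vectorized, so the same 0/1 mask can be computed over the whole matrix in one step, without apply; a minimal sketch, where the small matrix m is a hypothetical stand-in for the real 19000 x 340 one:)

```r
# Hypothetical small matrix standing in for the real 19000 x 340 one
m <- matrix(c(10, 300, 255, 256), nrow = 2)

# Vectorized equivalent of apply()ing mask256 element-wise:
# the comparison yields a logical matrix, and adding 0 coerces it to 0/1
mask <- (m >= 256) + 0
```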

What happens is that the memory required for the session starts
ballooning. The matrix, together with a few other objects, starts at about
160M, but memory use then quickly climbs to 750M and stays there even
after the function has completed.

I am fairly new to R. Is there something I should know about writing
functions, i.e. do I need to clean up at the end of the function? It seems
R cannot release the memory once it has been used. When I close the R
application and open it again, the memory is back down to what it is
supposed to be: the size of the workspace, plus any new objects that I
have created.
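(For what it is worth, the usual way to free memory within a session is to remove large objects and then run the garbage collector; a sketch, where `big` is a hypothetical large object. Note that even after gc() frees memory inside R, whether the process hands it back to the operating system is up to the allocator:)

```r
big <- matrix(0, nrow = 19000, ncol = 340)  # hypothetical large object

rm(big)  # drop the reference to the object
gc()     # run the garbage collector; returns a memory-usage summary
```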

Does anybody know what is going on?

Peter

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



This archive was generated by hypermail 2.1.3 : Mon 31 May 2004 - 23:05:11 EST