[R] Once again in search of memory


From: Eric Lecoutre (lecoutre@stat.ucl.ac.be)
Date: Wed 26 May 2004 - 20:23:32 EST


Hello any R user,

I am working with R 1.8.1 and R 1.9 on Windows, with 256 MB of RAM.
I am trying to debug/correct/speed up the execution of one of our students'
programs. Basically, we perform simulations.
R is launched with high memory limits.
One execution of the program requires nearly 200 MB and runs fine the first
time. Launching it again then does strange things: seen from the Windows
Task Manager, the RAM used by R never exceeds those 200 MB (it often stays
near 130 MB). Seen from R:

> gc(TRUE)
Garbage collection 280 = 85+59+136 (level 2) ...
396784 cons cells free (40%)
145.7 Mbytes of heap free (53%)
           used  (Mb) gc trigger  (Mb)
Ncells   587240  15.7     984024  26.3
Vcells 16866491 128.7   35969653 274.5

And then each subsequent call to the function makes this memory use
(Vcells) grow further.
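One way to pin down that growth is to compare the gc() statistics around a single call. A minimal sketch, where run_once() is a hypothetical stand-in for the student's simulation function:

```r
# Sketch: measure Vcell growth across one call; run_once() is a
# hypothetical stand-in for the simulation being debugged.
run_once <- function() sum(rnorm(1e5))

before <- gc()["Vcells", "used"]   # Vcells in use before the call
run_once()
after <- gc()["Vcells", "used"]    # Vcells in use after collection
cat("Vcells changed by", after - before, "cells\n")
```

gc() returns a matrix with rows "Ncells"/"Vcells", so the "used" column can be read directly; logging it between runs shows whether objects really survive each call.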
How come the Windows Task Manager states R only uses 71 MB of RAM, as seen
in the attached screenshot?
Is this a known issue? Is there any tip to "really" release the memory held
by all those objects we don't use anymore?
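For what it's worth, the usual pattern for reclaiming memory between runs is to remove the large objects explicitly and then trigger a collection. A minimal sketch under that assumption (the object names are illustrative, not from the original program):

```r
# Sketch: explicitly drop large intermediates between runs, then collect.
results <- matrix(rnorm(1e6), ncol = 100)  # large intermediate (~8 MB)
summary_stats <- colMeans(results)         # keep only the summary we need

rm(results)      # remove the reference to the large matrix
invisible(gc())  # run the collector so the freed Vcells become reusable
```

Note that even after gc(), R's heap may not shrink from the operating system's point of view; the collector makes the cells reusable inside R rather than handing the pages back to Windows.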

I also tried memory.profile(), which reports an unusually large number of
list-type objects. We only manipulated matrices; could those lists come
from calls to 'apply'?
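They can: apply() simplifies to a matrix or vector only when every per-column result has the same length, and otherwise falls back to a list. A small illustration, not taken from the original code:

```r
m <- matrix(1:6, nrow = 2)

# apply() simplifies to a matrix when every per-column result has
# the same length...
same_len <- apply(m, 2, range)               # a 2 x 3 matrix

# ...but silently falls back to a list when the lengths differ,
# which would show up as list-type objects in memory.profile().
diff_len <- apply(m, 2, function(x) x[x > 2])
```

Here the filtered columns have lengths 0, 2 and 2, so diff_len is a list rather than a matrix.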

Thanks for any insights and advice on how to handle memory. I keep going
round and round the help pages.


Eric Lecoutre
UCL / Institut de Statistique
Voie du Roman Pays, 20
1348 Louvain-la-Neuve

tel: (+32)(0)10473050

If the statistics are boring, then you've got the wrong numbers. -Edward

R-help@stat.math.ethz.ch mailing list
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


This archive was generated by hypermail 2.1.3 : Mon 31 May 2004 - 23:05:12 EST