From: Eric Lecoutre (firstname.lastname@example.org)
Date: Wed 26 May 2004 - 20:23:32 EST
Hello R users,
We are working with R 1.8.1 and R 1.9 on Windows, with 256 Mb of RAM.
I am trying to debug, correct, and speed up one of our students' programs. Basically, we perform simulations.
R is launched with high memory limits.
One execution of the program requires nearly 200 Mb and is fine the first
time. Launching it again, however, does strange things: seen from the Windows
Task Manager, the RAM used by R never exceeds those 200 Mb (often staying near 130 Mb).
Seen from R:
Garbage collection 280 = 85+59+136 (level 2) ...
396784 cons cells free (40%)
145.7 Mbytes of heap free (53%)
            used  (Mb) gc trigger  (Mb)
Ncells    587240  15.7     984024  26.3
Vcells  16866491 128.7   35969653 274.5
Each new call to the function then causes this memory use (Vcells) to grow
further. How come the Windows Task Manager states that R only uses 71 Mb of RAM?
Is this a known issue? Is there any tip to "really" release the memory held by
all those objects we don't use anymore?
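For the record, what we do to try to free memory is roughly this (the object name 'res' is just illustrative, not from the actual program):

```r
## Create a large throw-away object, as the simulation does at each run
res <- matrix(rnorm(1e6), ncol = 100)

## Drop the binding, then ask R to run the garbage collector;
## gc() is also how we obtained the Ncells/Vcells figures above
rm(res)
invisible(gc())
```

As far as I understand, gc() frees the cells inside R's heap, but whether that memory is returned to Windows is another matter.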
I also tried memory.profile(), which shows mostly list-type objects. We
manipulated matrices. Could the lists come from calls to 'apply'?
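To illustrate where lists could come from (a toy example, not the student's code): apply() normally simplifies its result to a vector or matrix, but lapply() always builds a list, and apply() itself falls back to a list when FUN returns results of unequal length.

```r
m <- matrix(1:6, nrow = 2)

## apply() over columns: results of equal length are simplified
r1 <- apply(m, 2, sum)      # numeric vector of column sums

## lapply() always returns a list, one element per column index
r2 <- lapply(seq_len(ncol(m)), function(j) m[, j])
```

So if the simulation loops over lapply()-style constructs, list objects accumulating in memory.profile() output would not be surprising.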
Thanks for any insight and advice on how to handle memory. I keep going round
and round in the help pages.
UCL / Institut de Statistique
Voie du Roman Pays, 20
If the statistics are boring, then you've got the wrong numbers. -Edward Tufte
R-help@example.com mailing list
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
This archive was generated by hypermail 2.1.3 : Mon 31 May 2004 - 23:05:12 EST