Re: [Rd] R scripts slowing down after repeated calls to compiled code

From: Vladimir Dergachev <>
Date: Fri, 25 May 2007 19:29:55 -0400

On Friday 25 May 2007 7:12 pm, Michael Braun wrote:
> Thanks in advance to anyone who might be able to help me with this.
> Also, it is not just the compiled call that slows down. EVERYTHING
> slows down, even calls that consist only of standard R functions. The
> time for each of these function calls is roughly proportional to the
> time of the .Call to the C function.
> Another observation is that when I terminate the algorithm, do a
> rm(list=ls()), and then a gc(), not all of the memory is returned to
> the OS. It is not until I terminate the R session that I get all of
> the memory back. In my C code, I am not doing anything to de-allocate
> the SEXPs I create, relying on the PROTECT/UNPROTECT mechanism instead
> (is this right?).
> I spent most of the day thinking I had a memory leak, but that no
> longer appears to be the case. I tried using Rprof(), but that only
> gives me the aggregated relative time spent in each function (more than
> 80% of the time, it's in the .Call).

One possibility is that you are somehow creating a lot of R objects (say, by calling assign() or missing an UNPROTECT()), and this slows the garbage collector down. The garbage collector's running time grows with the number of objects you have - their total size does not have to be large.

Could you try printing the numbers from the gc() call and checking whether the number of allocated objects grows a lot?
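For reference, the PROTECT/UNPROTECT pattern the original post asks about can be sketched as below. This is a hypothetical .Call function (the name `balanced_example` is invented); it uses R's documented C API and builds only against an R installation, e.g. with `R CMD SHLIB`:

```c
#include <R.h>
#include <Rinternals.h>

/* Every allocation is PROTECTed, and the matching UNPROTECT(n)
   runs before returning. You do not free SEXPs yourself; once
   unprotected and unreferenced, the garbage collector reclaims them. */
SEXP balanced_example(SEXP x)
{
    R_xlen_t n = XLENGTH(x);
    SEXP out = PROTECT(allocVector(REALSXP, n));  /* 1 object protected */
    for (R_xlen_t i = 0; i < n; i++)
        REAL(out)[i] = REAL(x)[i] * 2.0;
    UNPROTECT(1);  /* balance the PROTECT before returning */
    return out;
}
```

If an UNPROTECT is missed on some code path, each call leaves objects pinned on the protection stack for the rest of the session, and the Ncells/Vcells counts reported by gc() will climb across calls - exactly the growth worth checking for here.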


                            Vladimir Dergachev

> So I'm stuck. Can anyone help?
> Thanks,
> Michael

Received on Fri 25 May 2007 - 23:34:25 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Sat 26 May 2007 - 02:33:05 GMT.

Mailing list information is available online. Please read the posting guide before posting to the list.