Re: [Rd] R scripts slowing down after repeated calls to compiled code

From: Dirk Eddelbuettel <edd_at_debian.org>
Date: Fri, 25 May 2007 20:40:18 -0500

On 25 May 2007 at 19:12, Michael Braun wrote:
| So I'm stuck. Can anyone help?

It sounds like a memory issue. Your memory may just be getting fragmented. One tool that may help you find leaks is valgrind -- see the 'Writing R Extensions' manual. I can also recommend visualisers such as kcachegrind (part of KDE).
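
For example, with test.R standing in for whatever script exercises your compiled code, the manual describes running R under valgrind along these lines:

  R -d "valgrind --tool=memcheck --leak-check=full" --vanilla < test.R

Leaks in your C code will then show up in the "definitely lost" summary at the end of the run.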

But it may not be a leak. I have found that R just doesn't cope well with many large memory allocations and releases -- I often loop over data requests, subsetting and processing each one. This drives my 'peak' memory use to 1.5 or 1.7 GB on 32-bit multicore machines with 4 GB, 6 GB or 8 GB of RAM (32-bit imposing a hard 3 GB per-process limit). And I just can't loop over many such tasks. So I now use the littler frontend to script this: dump the processed chunks as Rdata files and re-read the pieces later. That works reliably.
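
A minimal sketch of that chunked workflow (mydata, chunk.index, nchunks and process() are hypothetical placeholders):

  ## Pass 1: process each chunk, saving results to disk instead of
  ## accumulating them in memory (run one chunk per littler call if
  ## a single session still fragments).
  for (i in 1:nchunks) {
    chunk  <- mydata[chunk.index == i, ]   # subset one piece of the data
    result <- process(chunk)               # hypothetical processing step
    save(result, file = sprintf("result_%03d.Rdata", i))
    rm(chunk, result); gc()                # release memory before the next chunk
  }

  ## Pass 2 (possibly a fresh session): re-read and combine the pieces.
  files   <- list.files(pattern = "^result_.*\\.Rdata$")
  results <- lapply(files, function(f) { load(f); result })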

So one thing you could try is to dump your data in a 'GSL-ready' format from R, quit R to take it out of the equation, and then see what happens if you run the iterations using only GSL and your code.
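
For instance, a plain whitespace-separated text dump (X and data.txt being hypothetical names) that a small C driver could read back with GSL's gsl_matrix_fscanf():

  ## Write the matrix X as bare numbers, one row per line, in the
  ## row-major element order that gsl_matrix_fscanf() reads.
  write.table(X, file = "data.txt", row.names = FALSE, col.names = FALSE)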

Hth, Dirk

-- 
Hell, there are no rules here - we're trying to accomplish something. 
                                                  -- Thomas A. Edison

______________________________________________
R-devel_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel