Re: [R] Speed up an R code

From: jim holtman <>
Date: Fri, 27 May 2011 15:28:59 -0400

Take a small subset of your program that runs through the critical sections and use ?Rprof to see where some of the hot spots are. How do you know it is not using the CPU? Are you using perfmon to look at what is being used? Are you paging? If you are not paging and not doing a lot of I/O, then you should tie up one CPU at 100% if you are CPU bound.
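For example, profiling a small representative run might look like this (the workload below is just a stand-in for your own critical section):

```r
# Profile a short, representative run and report where time goes.
Rprof("profile.out")                  # start the sampling profiler, write to a file
x <- matrix(rnorm(5e6), ncol = 100)   # stand-in for the real workload
res <- apply(x, 2, sd)                # column-wise summary as the "hot" step
Rprof(NULL)                          # stop profiling
summaryRprof("profile.out")$by.self  # time spent in each function, by self time
```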

You probably need to put some output in your program to mark its progress. At a minimum, do the following:

cat('I am here', proc.time(), '\n')

By changing the initial string, you can see where you are, and this also reports the user CPU, system CPU, and elapsed time, which should be a good indication of where time is being spent. There are a number of things you can do to "instrument" your code. If I had a program that ran for hours, I would definitely have something that tells me where it is and how much time is being taken. If you have some large loop, you could print this information every n'th time through, with the tag on the message indicating how far along you are. There is also the progress bar, which I use a lot to see if I am making progress.
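A minimal sketch of that kind of instrumentation, with a tagged cat() every n'th iteration plus a progress bar (the loop body here is only a placeholder):

```r
# Illustrative: report progress every 100th pass through a long loop.
n <- 1000
pb <- txtProgressBar(min = 0, max = n, style = 3)
total <- 0
for (i in seq_len(n)) {
  total <- total + sqrt(i)            # placeholder for the real work
  if (i %% 100 == 0) {                # print a tagged timestamp every n'th time
    cat("\nat iteration", i, "-", proc.time()[1:3], "\n")
  }
  setTxtProgressBar(pb, i)
}
close(pb)
```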

After you have instrumented your code and have used Rprof, you should have some data that people can help you with.

If you are using data frames a lot, remember that indexing into them can be costly. Converting them to matrices, where appropriate, can give a big speedup. Rprof will show you this.
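A quick way to see the difference yourself (a toy benchmark, not taken from the original thread):

```r
# Compare repeated element access on a data frame vs. the same data as a matrix.
df <- data.frame(a = runif(10000), b = runif(10000))
m  <- as.matrix(df)
t_df <- system.time(for (i in 1:10000) df[i, 1])[["elapsed"]]
t_m  <- system.time(for (i in 1:10000) m[i, 1])[["elapsed"]]
c(data.frame = t_df, matrix = t_m)   # the matrix version is typically much faster
```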

On Fri, May 27, 2011 at 2:00 PM, Debs Majumdar <> wrote:
> Hello,
>   Are there some basic things one can do to speed up a R code? I am new to R and currently going through the following situation.
>   I have run a R code on two different machines. I have R 2.12 installed on both.
>   Desktop 1 is slightly older and has a dual core processor with 4gigs of RAM. Desktop 2 is newer one and has a xeon processor W3505 with 12gigs of RAM. Both run on Windows 7.
>   I don't really see any significant speed up in the newer computer (Desktop 2). In the older one the program took around 5hrs 15 mins and in the newer one it took almost 4hrs 30mins.
>  In the newer dekstop, R gives me the following:
>> memory.limit()
> [1] 1024
>> memory.size()
> [1] 20.03
>  Is something hampering me here? Do I need to increase the limit and the size? Can this change be made permanent? Or am I looking in the wrong place?
>  I have never seen my R programs using much CPU or RAM while running. If this is not something inherent to R, then I guess I need to write more efficient code.
>  Suggestions/solutions are welcome.
>   Thanks,
>   -Debs
> ______________________________________________
> mailing list
> PLEASE do read the posting guide
> and provide commented, minimal, self-contained, reproducible code.

Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?

Received on Fri 27 May 2011 - 19:30:33 GMT


Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Fri 27 May 2011 - 19:40:10 GMT.
