[Rd] setTimeLimit() fails when out of memory.

From: Jeroen Ooms <jeroenooms_at_gmail.com>
Date: Fri, 22 Jan 2010 14:14:27 -0800 (PST)

I am using the function setTimeLimit(elapsed=60) in R web applications to prevent processes from taking up too many resources. This works fine when a calculation takes too long; however, when the machine runs out of memory, the time limit also seems to fail, resulting in a stuck process that holds on to maximum memory until it is killed manually. This probably makes sense in some way, but I was wondering if there is a solution. Some example code using ggplot2 that crashes my machine:

#time limit works fine:

setTimeLimit(elapsed=10);
while(TRUE) {sqrt(pi)}
setTimeLimit();

#time limit fails:

library(ggplot2);
x <- rnorm(1e6);
y <- rnorm(1e6);
setTimeLimit(elapsed=10);
qplot(x,y,geom="density2d");
setTimeLimit();
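One OS-level mitigation I am experimenting with (my own workaround idea, not something setTimeLimit provides; the -v option is a bash ulimit flag and the 1 GiB value is arbitrary) is to cap the address space before starting R, so a runaway allocation fails with an ordinary "cannot allocate vector" error instead of swapping until the box dies:

```shell
# Rough sketch: cap virtual memory for this shell and its children,
# then run the offending code in a fresh R process.
ulimit -v 1048576      # limit address space to ~1 GiB (value is in kB)
Rscript -e 'library(ggplot2); x <- rnorm(1e6); y <- rnorm(1e6);
            qplot(x, y, geom = "density2d")'
```

With the cap in place the allocation error propagates as a normal R condition, so it can be caught with tryCatch rather than requiring a manual kill.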

Jeroen Ooms
www.stat.ucla.edu/~jeroen


______________________________________________
R-devel_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Received on Sat 23 Jan 2010 - 14:28:54 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Sat 23 Jan 2010 - 18:30:16 GMT.

