Re: [R] R 2.1.0 RH Linux Built from Source Segmentation Fault

From: Peter Dalgaard <>
Date: Fri 20 May 2005 - 17:04:08 EST

Bruce Foster <> writes:

> The machines are AMD Athlon MP 2400+ with 2 GB RAM, dual CPUs, and
> lots of free disk space.

Any per-user/per-process limits? Resource usage looks suspiciously close to 256M. If your installation allows memory overcommitment, the OS can kill processes at unpredictable times.
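Both things can be checked from the shell; this is a generic sketch for the Linux boxes described above, not specific to any distribution:

```shell
# Inspect per-process resource limits -- a data/address-space cap near
# 256M here would match the failure pattern described above.
ulimit -a

# Check the kernel's overcommit policy: 0 or 1 allow overcommit, in which
# case the kernel may kill a process long after its allocations succeeded.
cat /proc/sys/vm/overcommit_memory
```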

> I've got a user running Monte Carlo codes that fail with segmentation
> faults on a frequent basis. The jobs run for a long time (up to a day)
> before failure.
> If a failed job is rerun, chances are high that it will run to completion.
> I'm at a loss about approaching this problem. R (as it is here)
> doesn't seem to give much of a hint as to where things are when it
> crashes.
> I'm looking for some guidance to diagnose this problem so we can focus
> on a solution.

(A) Use set.seed(...) to get a fixed sequence of random numbers. If it still fails unpredictably, my bet is that it is a resource problem.
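A minimal sketch of (A), driven from the shell; the inline one-liners stand in for the user's actual Monte Carlo script, and the seed value is arbitrary:

```shell
# Two runs under the same seed draw the same random sequence, so a crash
# that depends on the random draws should now recur on every run.
echo 'set.seed(12345); print(runif(3))' | R --vanilla --slave
echo 'set.seed(12345); print(runif(3))' | R --vanilla --slave   # same numbers
```

If the job still fails only sometimes under a fixed seed, the randomness is not the trigger, which points back at resources.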

(B) Once you have a case that fails predictably, run it under a debugger and try to backtrack to the point of failure. There are various debugging tricks that you can use, but just get there first and show us a stack backtrace at the failure point (bt command in gdb).
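For (B), R's Unix front end can hand the process straight to a debugger; a sketch of the usual session follows, with `sim.R` a placeholder for the user's failing script:

```shell
# Start R under gdb so the segfault is caught with the stack intact.
R -d gdb

# Then, at the (gdb) prompt:
#   (gdb) run            # launches R; source("sim.R") at the R prompt
#   ... wait for the SIGSEGV ...
#   (gdb) bt             # print the stack backtrace at the crash site
```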

For more detailed guidance you should probably move the discussion to the r-devel list.

   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (             FAX: (+45) 35327907

______________________________________________ mailing list
PLEASE do read the posting guide!
Received on Fri May 20 17:09:23 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:31:56 EST