[R] help track a segmentation fault

From: Omar Lakkis <uofiowa_at_gmail.com>
Date: Thu 05 May 2005 - 02:51:55 EST


I have an R script that I run using

nohup R CMD BATCH r.in r.out &

The code loops through data from a database and takes hours to run. The problem is that about an hour and a half after I start the script, the program stops and I get

/usr/lib/R/bin/BATCH: line 55: 14067 Done                    ( echo
"invisible(options(echo = TRUE))"; cat ${in}; echo "proc.time()" )
     14068 Segmentation fault      | ${R_HOME}/bin/R ${opts} >${out} 2>&1

in the nohup.out file.
If I run the code from within R, rather than via CMD BATCH, R segfaults after roughly the same hour and a half.
I monitored the process with "top" and found nothing unusual: memory utilization stays around 15% and CPU time in the 90s%. I do not see a steady increase in memory usage that would signal a memory leak. A core dump file was generated, but I do not know what to do with it. Can someone please suggest how I can track this problem down, perhaps using R's -d flag, which I do not know how to use?
I am running R 2.1.0 on debian.
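For reference, I gather the usual debugging workflow looks roughly like this. This is only a sketch: the binary path is my guess for the Debian package (the real R executable, not the shell front end, is what gdb needs), and the gdb commands are the standard ones.

```shell
# 1) Load the existing core dump against the real R binary
#    (path is an assumption for the Debian layout; adjust as needed):
gdb /usr/lib/R/bin/exec/R core
#    then at the (gdb) prompt, `bt` prints the backtrace at the crash site.

# 2) Alternatively, start R under the debugger with the -d flag:
R -d gdb
#    at the (gdb) prompt type `run` to launch R, source the script as usual,
#    and after the segfault `bt` shows where it died.

# 3) To be sure a core dump is written for the batch run, raise the limit first:
ulimit -c unlimited
nohup R CMD BATCH r.in r.out &
```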



R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Thu May 05 03:32:28 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:31:35 EST