[R] Gaussian frailty leads to segmentation fault

From: Christian Lederer <christian.lederer_at_imse.med.tu-muenchen.de>
Date: Thu 29 Jul 2004 - 02:38:01 EST

Dear R gurus,

For a simulation concerning study effects and historical controls in survival analysis, I would like to experiment with a Gaussian frailty model.

The simulated scenario consists of a randomized trial (treatment and placebo) and historical controls (only placebo).

So the simulated data frames consist of four columns:
$time, $cens, $study, $treat.
$time and $cens are the usual survival data.
For the binary treatment indicator we have
$treat == 0 or 1 if $study == 1,
$treat == 1 if $study > 1.

Typical parameters for my simulations are:

- sample sizes (per arm): between 100 and 200
- number of historical studies: between 7 and 15
- hazard ratio treatment/placebo: between 0.7 and 1
- variance of the study effect: between 0 and 0.3
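For reference, here is a minimal sketch of how such a data set might be generated under the setup and parameter ranges above. All object names and the concrete parameter values are illustrative assumptions, not taken from the original post; the baseline hazard, censoring mechanism, and treatment coding (treat == 0 for treated patients in study 1, following the indicator definition above) are assumptions as well.

```r
## Hedged sketch of the simulated scenario described above.
## All names and values are illustrative assumptions.
set.seed(1)
n.arm  <- 150   # patients per arm (100-200)
n.hist <- 10    # number of historical studies (7-15)
hr     <- 0.8   # hazard ratio treatment/placebo (0.7-1)
s2     <- 0.2   # variance of the study effect (0-0.3)

## study 1 is the randomized trial; studies 2..(n.hist+1) are the
## historical placebo-only studies.  Per the coding above, $treat == 1
## for all historical patients.
study <- c(rep(1, 2 * n.arm), rep(2:(n.hist + 1), each = n.arm))
treat <- c(rep(0:1, each = n.arm), rep(1, n.hist * n.arm))

## Gaussian study effect on the log-hazard scale
b <- rnorm(n.hist + 1, mean = 0, sd = sqrt(s2))

## exponential event times; treated patients (treat == 0 in study 1)
## get the hazard ratio hr
rate <- exp(b[study]) * ifelse(study == 1 & treat == 0, hr, 1)
time <- rexp(length(rate), rate = rate)
cens <- rbinom(length(rate), 1, 0.8)   # crude ~20% random censoring

simdata <- data.frame(time = time, cens = cens,
                      study = study, treat = treat)
```

A data frame of this shape can then be passed to the `coxph` call below via its `data` argument.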

Depending on the sample sizes, the following call sometimes leads to a segmentation fault:

coxph(Surv(time, cens) ~
        as.factor(treat) + frailty(study, distribution = "gaussian"),
I noticed that this segmentation fault occurs most frequently if the number of randomized treatment patients is higher than the number of randomized placebo patients and the number of historical studies is large.
There seems to be no problem if there are at least as many randomized placebo patients as treated patients. Unfortunately, this is not the situation I want to investigate (historical controls should be used to decrease the number of treated patients).

Is there a way to circumvent this problem?


Is it allowed to attach gzipped sample data sets on this mailing list?

R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Thu Jul 29 01:48:45 2004

This archive was generated by hypermail 2.1.8 : Fri 18 Mar 2005 - 02:40:54 EST