RE: [R] Performance problem

From: Stephan Moratti <>
Date: Tue 20 Jul 2004 - 21:03:04 EST

>Dear all,
>I have a performance problem in terms of computing time.
>I estimate mixed models on a fairly large number of subgroups (10000) using
>lme(.) within the by(.) function and it takes hours to do the calculation
>on a fast notebook under Windows.
>I suspect by(.) to be a poor implementation for doing individual analysis
>on subgroups.
>Is there an alternative and more efficient way for doing by-group
>processing within lme(.).
>Here is some code to give you a glimpse:
>
>gfit <- by(longdata, gen,
>           function(x) lme(fixed = response ~ dye + C(treat, base = 4),
>                           data = x, random = ~ 1 | slide))
>Thanks in advance & regards
>Gerhard Krennrich

Sorry that I can't contribute a solution, but I have a similar problem: I run lme fits on 350 source estimations of MEG brain data. So if somebody knows an improvement, please let me know!
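One alternative worth trying is to split the data frame once and loop with lapply(), which makes the per-group overhead explicit (though most of the time is likely spent inside lme() itself); nlme also provides lmList() for fitting a common formula to each group. A minimal sketch, using lm() on made-up data in place of the real lme() call, since longdata and the mixed-model structure from the original post are not available here:

```r
## Sketch: split once, then fit one model per subgroup with lapply().
## Made-up data standing in for the poster's "longdata" and "gen".
set.seed(1)
longdata <- data.frame(
  gen      = rep(paste0("g", 1:100), each = 8),  # 100 subgroups
  response = rnorm(800),
  dye      = gl(2, 4, 800)
)

## split() computes the grouping only once; by() recomputes more per call
groups <- split(longdata, longdata$gen)
gfit   <- lapply(groups, function(x) lm(response ~ dye, data = x))
length(gfit)  # one fitted model per subgroup: 100
```

For the real problem you would replace lm() with the lme() call from the original post; whether this beats by() noticeably depends on how much of the time is spent in lme() itself.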

Stephan Moratti

Dipl. Psych. Stephan Moratti
Dept. of Psychology
University of Konstanz
P.O. Box D25
D-78457 Konstanz, Germany
Phone: +49 (0)7531 882385
Fax: +49 (0)7531 884601

R-help mailing list -- PLEASE do read the posting guide!

Received on Tue Jul 20 21:12:03 2004

This archive was generated by hypermail 2.1.8 : Fri 18 Mar 2005 - 02:37:13 EST