R-alpha: Memory

From: bent@stat.ubc.ca
Message-Id: <199605211637.JAA28247@fisher.stat.ubc.ca>
Subject: R-alpha: Memory
To: r-testers@stat.math.ethz.ch (r-testers)
Date: Tue, 21 May 1996 09:37:58 -0700 (PDT)

From S-NEWS@utstat.toronto.edu

The messages below are from the S-Plus list; the first is a reply to the
second. I wonder how R would cope with data this large. I believe that
GLIM does not form the design matrix explicitly, and I had thought that
would be standard by now, but maybe not?
    Bent
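
As a back-of-the-envelope check (my own rough estimate, assuming that
lm builds a dense double-precision model matrix): the fit in the second
message has an intercept, the continuous term dachol, and 106 contrasts
for the 107-level factor cc, i.e. about 108 columns, so

    ## rough size of a dense model matrix: rows * columns * 8 bytes
    17065 * 108 * 8        # about 14.7 Mb

which is of the same order as the 15351616-byte request in the error
message below, before counting the workspace that dqrls needs on top.
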
> 
> While not having a solution to this problem, I have experienced it
> several times. It seems to me that this is a big drawback of S-Plus.
> While I don't have a good understanding of how the "lm" procedure
> works, it seems that it must be storing design matrices in full rather
> than using sparse matrix storage techniques. The answer should not be
> to buy more RAM, but to code "lm" more efficiently. Has anyone done this?
> 
> A little while ago, I tried to run a "varcomp" on a dataset with 60,000
> observations and two random effects with about 120 levels in total.
> S-Plus (3.3) bombed out on a Sun SPARC20 with 64 Mb of RAM. Using
> another program written specifically for variance component estimation,
> the model ran on a Linux PC with 16 Mb, running X Windows at the same
> time. I know this is an apples-versus-oranges comparison, but I reckon
> S-Plus should be able to do this as well.
> 
> Andrew Swan
> CSIRO Division of Animal Production
> Pastoral Research Laboratory
> Armidale  2350  AUSTRALIA
> ph.  +61 (0)67 761377
> fax  +61 (0)67 761371
> email: andrew@ovis.chiswick.anprod.csiro.au
> 
> 
> On Mon, 20 May 1996 sarah.parish@ctsu.ox.ac.uk wrote:
> 
> > 
> > I am having memory problems trying to do various regressions on a Sun
> > with 48 Mb of RAM.  As a start, I am trying to fit an lm to 17065
> > observations, regressing on a factor variable with 107 levels.
> > S-Plus demands a huge amount of memory (see below).  Is there anything
> > I can do to get round this?  I want to go on to add further terms,
> > probably another 30 effects.  Can anyone give me an idea of how much
> > memory I would need to install on the Sun to do this - or should I be
> > using some other software?
> > 
> > > options(object.size=20000000)
> > > lmch<-lm(chol ~ dachol+cc,data=lipdat)
> > Error in .Fortran("dqrls",: Unable to obtain requested dynamic memory 
> > (this request is for 15351616 bytes, 68056560 bytes already in use)
> > 
> > Thanks
> > 
> > Sarah
> > 
> 
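
On the sparse-matrix point above: for a purely factor-based model the
least-squares fit never needs the model matrix at all. Here is a minimal
sketch of the idea in R (my own illustration, not GLIM's or lm's actual
code; fit.factor.means is a made-up helper) for a single-factor model:

    ## Fit y ~ f for one factor f: the least-squares coefficients are
    ## just the per-level means, so no n-by-levels dummy matrix is
    ## ever stored -- storage stays O(n) + O(levels).
    fit.factor.means <- function(y, f) {
      f <- as.factor(f)
      coefs <- tapply(y, f, mean)   # one coefficient per level
      res   <- y - coefs[f]         # residuals; coefs[f] expands by codes
      list(coefficients = coefs, residuals = res)
    }

    ## e.g. for Sarah's factor term (ignoring the covariate dachol):
    ## fit <- fit.factor.means(lipdat$chol, lipdat$cc)

With several factors the same idea carries over: the blocks of X'X are
just contingency tables (table(f1), table(f1, f2), ...), which is
presumably what the specialised variance-component program exploits.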


-- 
   Bent Jorgensen                    Voice: (604) 822-3167
   Department of Statistics          Fax:   (604) 822-6960
   University of British Columbia    E-mail: bent@stat.ubc.ca
   333-6356 Agricultural Road        ftp: newton.stat.ubc.ca
   Vancouver B.C. Canada V6T 1Z2     
 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
r-testers mailing list -- To (un)subscribe, send "subscribe" or
"unsubscribe" (in the body, not the subject!)
To: r-testers-request@stat.math.ethz.ch
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-