Re: R-alpha: Memory exhausted

Douglas Bates (bates@stat.wisc.edu)
Fri, 15 Nov 1996 07:14:57 -0600 (CST)


Message-Id: <m0vOO6j-000hlvC@franz.stat.wisc.edu>
Date: Fri, 15 Nov 1996 07:14:57 -0600 (CST)
From: Douglas Bates <bates@stat.wisc.edu>
To: steuer@gigamain.Statistik.Uni-Dortmund.DE (Detlef Steuer)
Subject: Re: R-alpha: Memory exhausted
In-Reply-To: <9611151137.AA15255@gigamain.Statistik.Uni-Dortmund.DE>

>>>>> "Detlef" == Detlef Steuer <steuer@gigamain.Statistik.Uni-Dortmund.DE> writes:

  Detlef> I'm running R-013 on a SUN. Everything seems to work fine.
  Detlef> My problem is 210432 data points from a time series (one
  Detlef> every 15 min, over 6 years).

  Detlef> When I try to load them into memory I get "memory exhausted".
  Detlef> "top" says something like 100 Mb of swap and 10 Mb of real
  Detlef> memory are still free.  My data adds up to around 2 Mb.

  Detlef> How can I increase the memory R uses? Or is it a bug?

I haven't checked how functions like scan() and read.table() work
internally in R, but I imagine it is similar to the way they work in S,
where the emphasis is on flexibility at the expense of storage use.  I
think the amount of storage used is proportional to the square of the
number of objects being scanned, or something like that.

The simplest way around this explosive storage growth is to cut your
original file into several pieces (20 pieces of 10,000 or so
observations each might work nicely), scan each piece separately, and
concatenate the results when you are done.
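
For example, something along these lines might do it -- this is only a
sketch, and the piece file names and the one-value-per-line layout are
assumptions you would adjust to your data.  A Unix "split -l 10000" on
the original file is one way to produce the pieces.

    files  <- paste("piece", 1:20, ".dat", sep = "")  # hypothetical piece names
    chunks <- lapply(files, scan)   # scan each piece on its own
    x <- unlist(chunks)             # concatenate into one long vector
    length(x)                       # should come out near 210432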