Re: [R] memory problem

From: Duncan Murdoch <murdoch_at_stats.uwo.ca>
Date: Thu 14 Jul 2005 - 22:16:48 EST

On 7/14/2005 7:19 AM, Ginters wrote:
> I'm a beginner in R and, therefore, I don't know how serious my trouble is.
> After running a script:
>
> t <- c(14598417794,649693)
>
> data = data.frame(read.spss("C:\\Ginters\\Kalibracija\\cal_data.sav"))
>
> Xs = as.matrix(data[,1:2])
>
> koef = data.frame(read.spss("C:\\Ginters\\Kalibracija\\f.sav"))
>
> piks = as.matrix(koef[,1])
>
> g = regressionestimator(Xs,piks,t)
>
> I get:
>
> Error: cannot allocate vector of size 1614604 Kb
> In addition: Warning messages:
> 1: Reached total allocation of 255Mb: see help(memory.size)
> 2: Reached total allocation of 255Mb: see help(memory.size)
>
> My OS is Win 2000 Professional.
> Those 2 objects are of sizes
>
> > object.size(Xs)
> [1] 805404
> > object.size(piks)
> [1] 115128
>
> respectively. The two source files take only 142KB and 60KB on disk.
> Why does R need so much memory (1.6 GB)? How can I enlarge the limit? Is it
> possible to swap part of the memory out to the hard drive? Or is the
> trouble only with my script?

This sounds like a problem with the regressionestimator function, which I think comes from the sampling package. You'll need to contact the maintainer of the package to find out why it needs so much memory, and whether there's a way to get what you want without it.
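As a rough sanity check, the size of the failed allocation is consistent with the estimator forming an n x n matrix of doubles, where n is the number of sampled units (from object.size(piks), n is roughly 14,000). This is only a guess about what the function does internally, not something verified against the package source:

```r
# Hypothesis (not checked against the sampling package source):
# regressionestimator allocates an n x n matrix of doubles, where
# n is the number of sampled units.
# object.size(piks) was 115128 bytes, so n is roughly 14,000.
n <- 14376                 # hypothetical number of sampled units
kb <- n^2 * 8 / 1024       # bytes in an n x n double matrix, as Kb
floor(kb)                  # compare with the 1614604 Kb in the error
```

If that guess is right, raising the limit with memory.limit() on a 32-bit Windows machine is unlikely to help with a 1.6 GB request; the fix would have to come from the package itself (or from a formulation that avoids the dense n x n matrix).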

Duncan Murdoch



R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Thu Jul 14 22:21:31 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:33:40 EST