Re: [R] memory issues

From: <>
Date: Wed, 16 Apr 2008 18:57:41 -0400

Oddly enough, the variogram modelling is rather quick in Surfer, but one cannot compute the standard errors there. I restricted the search to approximately the range of the variogram model (2000 m). I can get R to compute with 12079 observations, but at 13453 I run into the gstat error message.
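In gstat that restriction goes through the nmax/maxdist arguments to krige(). A sketch on gstat's bundled meuse data; the variogram model parameters here are illustrative only, not fitted:

```r
library(sp)
library(gstat)

data(meuse);      coordinates(meuse) <- ~x + y      # sample points
data(meuse.grid); gridded(meuse.grid) <- ~x + y     # prediction grid

# Illustrative variogram model -- fit your own with fit.variogram()
vm <- vgm(psill = 0.6, model = "Sph", range = 900, nugget = 0.05)

# Local kriging: at most 40 neighbours, all within 1000 m of each
# prediction location, so every kriging system solved stays small.
k <- krige(log(zinc) ~ 1, meuse, meuse.grid, model = vm,
           nmax = 40, maxdist = 1000)
```

With nmax/maxdist set, memory use no longer grows with the square of the full sample size, only with the neighbourhood size.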

I guess it's really down to 1) trying to reduce the original dataset, 2) adding 1 to 2 GB of RAM, or
3) running the predictions on a Linux machine with suitable RAM.
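One cheap way to pursue option 1) is to thin the data to one observation per cell of a coarse grid. A base-R sketch; the vectors x, y, z and the cell size are hypothetical stand-ins for the real coordinates and variable:

```r
# Hypothetical data standing in for the real 12000-point set
set.seed(1)
n <- 12000
x <- runif(n, 0, 10000)   # easting, metres (made-up extent)
y <- runif(n, 0, 10000)   # northing, metres
z <- rnorm(n)             # measured variable

cell <- 250  # grid cell size in metres; tune relative to the variogram range
id   <- paste(floor(x / cell), floor(y / cell))  # cell label for each point
keep <- !duplicated(id)                          # first point seen in each cell

x.t <- x[keep]; y.t <- y[keep]; z.t <- z[keep]
length(z.t)   # far fewer than 12000, at most one point per cell
```

Averaging z within each cell (e.g. with tapply()) would be an alternative to keeping the first point, at the cost of smoothing the short-range variation.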

I couldn't reproduce the memory.limit() issue. I'll chalk that up to errant typing on my part.
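For context, some base-R arithmetic on the numbers quoted below shows why an unrestricted neighbourhood is hopeless within a 2 GB address space:

```r
n <- 12000
# Pairs entering the empirical variogram (the "nearly 72 million" below):
choose(n, 2)        # 71994000
# A dense n x n double-precision covariance matrix, in GiB:
n^2 * 8 / 2^30      # about 1.07 GiB, before any fragmentation overhead
```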

Quoting Rubén Roa-Ureta <>:

> I think any geostatistical program/R package would have trouble
> handling 12000 observations on a PC. The empirical variogram would be
> built from all pairwise combinations of the 12000 observations
> (12000 choose 2), nearly 72 million
> pairs, and during kriging, if you didn't restrict the search
> neighbourhood, interpolation would involve solving something very big,
> more so if you defined a fine grid, ... Try restricting the search
> neighbourhood, if you didn't, with maxdist and nmax.
> Rubén
> Prof Brian Ripley wrote:
>> I think the clue is that the message you quote comes from gstat,
>> which does not use R's memory allocator. It is gstat and not R
>> that has failed to allocate memory.
>> Try re-reading the help page for memory.size. 'max=T' does not
>> indicate the limit (that is the job of memory.limit()), but the
>> maximum 'used' (acquired from the OS) in that session. So 19Mb
>> looks reasonable for R's usage.
>> I don't understand the message from memory.limit() (and the
>> formatting is odd). memory.limit() does not call max() (it is
>> entirely internal), so I wonder if that really is the output from
>> that command. (If you can reproduce it, please let us have precise
>> reproduction instructions.)
>> There isn't much point in increasing the memory limit from the
>> default 1.5Gb on a 2Gb XP machine. The problem is that the user
>> address space limit is 2Gb and fragmentation means that you are
>> unlikely to be able to get over 1.5Gb unless you have very many
>> small objects, in which case R will run very slowly. In any case,
>> that is not the issue here.
>> On Wed, 16 Apr 2008, Dave Depew wrote:
>>> Hi all,
>>> I've read the R for windows FAQ and am a little confused re:
>>> memory.limit and memory.size
>>> to start using R 2.6.2 on WinXP, 2GB RAM, I have the command line "sdi
>>> --max-mem-size=2047M"
>>> Once the Rgui is open, memory.limit() returns 2047, memory.size()
>>> returns 11.315, and memory.size(max=T) returns 19.615
>>> Shouldn't memory.size(max=T) return 2047?
>>> Upon running several operations involving kriging (gstat package,
>>> original data file 3 variables, 12000 observations)
>>> the program runs out of memory
>>> "memory.c", line 57: can't allocate memory in function m_get()
>>> Error in predict.gstat(, newdata = EcoSAV.grid.clip.spxdf,
>>> debug.level = -2, : m_get
>>> Immediately following this,
>>> memory.limit() returns [1] -Inf
>>> Warning message:
>>> In memory.limit() : no non-missing arguments to max; returning -Inf
>>> memory.size() returns 24.573.
>>> memory.size(max=T) returns 46.75

>>> To my untrained eye, it appears that R is not being allowed access to
>>> the full memory limit specified in the command line. If this is the case,
>>> how does one ensure that R gets access to the full allotment of RAM?
>>> Any insight is appreciated...
>>>> sessionInfo()
>>> R version 2.6.2 (2008-02-08)
>>> i386-pc-mingw32
>>> locale:
LC_COLLATE=English_United States.1252;LC_CTYPE=English_United States.1252;
LC_MONETARY=English_United States.1252;LC_NUMERIC=C;
LC_TIME=English_United States.1252
>>> attached base packages:
>>> [1] stats graphics grDevices datasets tcltk utils
>>> methods base
>>> other attached packages:
>>> [1] maptools_0.7-7 foreign_0.8-23 gstat_0.9-43 rgdal_0.5-24
>>> lattice_0.17-4 sp_0.9-23 svSocket_0.9-5 svIO_0.9-5
>>> R2HTML_1.58 svMisc_0.9-5 svIDE_0.9-5
>>> loaded via a namespace (and not attached):
>>> [1] grid_2.6.2 tools_2.6.2
>>> ______________________________________________
>>> mailing list
>>> PLEASE do read the posting guide
>>> and provide commented, minimal, self-contained, reproducible code.
Received on Thu 17 Apr 2008 - 00:24:18 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Thu 17 Apr 2008 - 00:30:29 GMT.

