Re: [Rd] External pointers and an apparent memory leak

From: Simon Urbanek <simon.urbanek_at_r-project.org>
Date: Thu, 15 Sep 2011 11:35:55 -0400

Jim,

On Sep 14, 2011, at 5:21 PM, James Bullard wrote:

> I'm using external pointers and seemingly leaking memory. My evidence for a leak is that the R process's memory, as seen by top, creeps up continually while the usage reported by gc() stays flat. I have isolated the problem to the following C code:
>
> void h5R_allocate_finalizer(SEXP eptr) {
>     Rprintf("Calling the finalizer\n");
>     void* vector = R_ExternalPtrAddr(eptr);
>     free(vector);
>     R_ClearExternalPtr(eptr);
> }
>
> SEXP h5R_allocate(SEXP size) {
>     int i = INTEGER(size)[0];
>     char* vector = (char*) malloc(i*sizeof(char));
>     SEXP e_ptr = R_MakeExternalPtr(vector, R_NilValue, R_NilValue);
>     R_RegisterCFinalizerEx(e_ptr, h5R_allocate_finalizer, TRUE);
>     return e_ptr;
> }
>
>
> If I run an R program like this:
>
> v <- replicate(100000, {
>     .Call("h5R_allocate", as.integer(1000000))
> })
> rm(v)
> gc()
>

This seems a little optimistic to me - at least on the machines most mortals have - since it will allocate ~93GB of memory (100,000 x 1,000,000 bytes, roughly 93 GiB) - before rm/gc:

vmmap:

                                 VIRTUAL ALLOCATION      BYTES
MALLOC ZONE                         SIZE      COUNT  ALLOCATED  % FULL
===========                      =======  =========  =========  ======
DefaultMallocZone_0x1004cf000      62.8M     120044      93.5G    152363%
environ_0x100601000                1024K         27       1280      0%
===========                      =======  =========  =========  ======
TOTAL                              63.8M     120071      93.5G    149977%

ps:

  UID   PID  PPID CPU PRI NI      VSZ    RSS WCHAN  STAT   TT       TIME COMMAND
  501 26287 26170   0  31  0 100511220  64864 -      S+   s002    1:06.81 /Library/Frameworks/R.framework/Resources/bin/exec/x86_64/R

Fortunately it is never actually touched, so the allocation is possible at all (it stays purely virtual). But, as Matt said, it gets released without problems - after rm/gc:

> gc()

         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 433341 23.2     667722 35.7   597831 32.0
Vcells 630031  4.9    1300721 10.0  1211088  9.3
                                 VIRTUAL ALLOCATION      BYTES
MALLOC ZONE                         SIZE      COUNT  ALLOCATED  % FULL
===========                      =======  =========  =========  ======
DefaultMallocZone_0x1004cf000      59.3M      19083      36.6M     61%
environ_0x100601000                1024K         27       1280      0%
===========                      =======  =========  =========  ======
TOTAL                              60.3M      19110      36.6M     60%

  501 26287 26170   0  31  0  2522872  60880 -      S+   s002    1:35.69 /Library/Frameworks/R.framework/Resources/bin/exec/x86_64/R

> sessionInfo()

R version 2.13.1 (2011-07-08)
Platform: x86_64-apple-darwin9.8.0/x86_64 (64-bit)

locale:
[1] en_US.UTF-8/en_US.UTF-8/C/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics grDevices utils datasets methods base

> Then you can see the problem (top reports that R still holds a lot of memory, but R doesn't think it does). Lest you think it is just top misreporting, I have also run valgrind, and it reports memory still outstanding at the end. I have tried Calloc/Free as well and that makes no difference. Finally, I see this in both R-2-12 (patched) and R-2-13, so I think it is more an understanding issue on my part.
>

You didn't mention your OS - some OSes do not release memory back to the system immediately (some wait until you try to allocate new memory), and some can't release certain types of memory at all. Also, depending on your OS's allocation library, you can get more information about the allocation pool to understand what is going on. But for that you'd have to share the platform info with us ...
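For example - and this is only a sketch, since the calls are platform-specific and not part of R's API, and the entry point name print_alloc_stats is made up here - glibc has malloc_stats()/malloc_trim() and the OS X allocator has malloc_zone_statistics(), so something along these lines could be wrapped with .Call to compare the allocator's own view with gc() and with top/ps/vmmap:

#include <R.h>
#include <Rinternals.h>

#if defined(__GLIBC__)
#include <malloc.h>         /* malloc_stats(), malloc_trim() - glibc only */
#elif defined(__APPLE__)
#include <malloc/malloc.h>  /* malloc_default_zone(), malloc_zone_statistics() */
#endif

/* Hypothetical helper: print what the C allocator thinks it is holding. */
SEXP print_alloc_stats(void)
{
#if defined(__GLIBC__)
    malloc_stats();     /* writes per-arena statistics to stderr */
    malloc_trim(0);     /* ask glibc to return free heap pages to the OS */
#elif defined(__APPLE__)
    malloc_statistics_t st;
    malloc_zone_statistics(malloc_default_zone(), &st);
    Rprintf("blocks in use: %u, size in use: %lu, allocated: %lu\n",
            st.blocks_in_use,
            (unsigned long) st.size_in_use,
            (unsigned long) st.size_allocated);
#else
    Rprintf("no allocator statistics available on this platform\n");
#endif
    return R_NilValue;
}

Calling .Call("print_alloc_stats") before and after rm(v); gc() would show whether the bytes are still held by the allocator or have gone back to the OS.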

> Thanks much in advance. To me it really resembles the connections.c code, but what am I missing?
>

Cheers,
Simon

PS: This has nothing to do with your question, but I'd suggest checking the result of malloc [e.g., if (!vector) Rf_error("unable to allocate %d bytes", i);]. Also, i = asInteger(size) is much safer (and more convenient) than i = INTEGER(size)[0], and, completely irrelevantly, as.integer(1000000) is more efficiently written as 1000000L.
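Put together, the allocator with those suggestions applied might look roughly like this (just a sketch of the points above, not a drop-in replacement for your package code; the PROTECT is an extra precaution beyond what I mentioned, since registering a finalizer allocates):

SEXP h5R_allocate(SEXP size)
{
    int i = asInteger(size);              /* safer than INTEGER(size)[0] */
    char *vector = (char*) malloc(i * sizeof(char));
    if (!vector)
        Rf_error("unable to allocate %d bytes", i);
    /* protect the external pointer while the finalizer is registered */
    SEXP e_ptr = PROTECT(R_MakeExternalPtr(vector, R_NilValue, R_NilValue));
    R_RegisterCFinalizerEx(e_ptr, h5R_allocate_finalizer, TRUE);
    UNPROTECT(1);
    return e_ptr;
}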

> thanks, jim


