Re: [R] memory tops out at 1.84gb on OS X 10.4 machine w/ 5GB ram

From: Albert Vilella <>
Date: Fri 16 Dec 2005 - 02:36:41 EST

On Thu, 15 Dec 2005 at 08:17 -0500, Roger D. Peng wrote:
> I'm not completely sure, but I don't think OS X is at the point yet where it can
> access > 2GB of memory (like, for example, Linux on Opteron). More
> specifically, I'm not sure a single process image can access > 2GB of memory,
> but I'd welcome any corrections to that statement. To be sure, this problem is
> not an issue with R because R has regularly been reported to access > 4GB of
> memory when the OS allows it.

I may have seen somewhere that OS X has a per-process limit of roughly 1.5 GB of RAM, and I think WinXP has a 2 GB limit.

On Linux, I believe a process can take ~4 GB, and more on x86-64 platforms, either directly or by tweaking some option in the kernel.

I'm not sure, though; someone else can probably give a clearer picture on this.


> -roger
> Ken Termiso wrote:
> > Hi all,
> >
> > Sorry if this is a dumb question, but I am on 10.4 with R2.2, and when
> > loading a big text file (~500MB) with scan(file, what=character) I am
> > throwing malloc errors that say I am out of memory...I have 5GB on this
> > machine, and Activity Monitor tells me R is only up to ~1.84GB both times
> > this has happened (running from terminal)...
> >
> > I am wondering why this is happening when I still have >2GB of free memory
> > waiting to be used...?
> >
> > Any advice would be much obliged,
> > Ken
> >
> > ______________________________________________
> > mailing list
> >
> > PLEASE do read the posting guide!
> >
Received on Fri Dec 16 02:57:11 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:41:39 EST