Re: [Rd] AIX testers needed

From: Jason Barnhart <jasoncbarnhart_at_msn.com>
Date: Mon, 21 May 2007 11:55:52 -0700

Thanks for responding.

I don't think it's that simple. That's a soft limit; the hard limit is "unlimited."

The results of gc() in the original post indicated that R could utilize more than 32MB of RAM.

My sysadmin had already increased my memory limits prior to my posting.

Just to confirm, here are the results with ulimit -m set to unlimited prior to calling R.

> xx <- matrix(rep(1e+10,1e7),nrow=1e4,ncol=1e3)
> object.size(xx)/1024^2
[1] 76.29405
> system("ulimit -m")

unlimited
> tmp.df <- as.data.frame(cbind(xx,xx,xx))
Error: cannot allocate vector of size 228.9 Mb
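
(As a rough arithmetic cross-check, not part of the original session: xx holds 1e7 doubles at 8 bytes each, and cbind(xx,xx,xx) needs three such copies, which is where the 228.9 Mb in the error message comes from.)

> 1e7 * 8 / 1024^2        # one copy of xx, in Mb
[1] 76.29395
> 3 * 1e7 * 8 / 1024^2    # the cbind(xx,xx,xx) result, in Mb
[1] 228.8818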

> Jason Barnhart wrote:
>> Thank you for responding.
>>
>> I should have added -a to my ulimit command. Here are its results,
>> which I believe are not the limiting factor.
>>
>> %/ > ulimit -a
>> core file size (blocks, -c) 1048575
>> data seg size (kbytes, -d) unlimited
>> file size (blocks, -f) unlimited
>> max memory size (kbytes, -m) 32768
>> open files (-n) 2000
>> pipe size (512 bytes, -p) 64
>> stack size (kbytes, -s) hard
>> cpu time (seconds, -t) unlimited
>> max user processes (-u) 128
>> virtual memory (kbytes, -v) unlimited
>
> you think max memory = 32768k (or 32MB) is not limiting?
> Please think again...
>
> HTL
>



R-devel_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
