Re: [R] about memory

From: jon butchar <butchar.2_at_osu.edu>
Date: Thu 31 Mar 2005 - 03:56:36 EST

Yes, you may need more memory unless you can somehow free a good amount of RAM or find a more memory-efficient method for clustering. If I'm reading it correctly, R wanted to allocate about 382 MB of memory on top of what it had already taken, but your computer had only about 98 MB of swap plus about 1 MB of RAM left to give.
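One memory-efficient option, sketched below as a guess (the original post does not say which clustering function was used): if the job is hierarchical clustering via hclust() on a full dist() matrix, cluster::clara() does k-medoids clustering on subsamples of the data, so it never has to hold the full n-by-n distance structure in memory. The data here is made up for illustration.

```r
# Hypothetical sketch: clustering a large data set without building the
# full distance matrix. Assumes the "cluster" package (a recommended
# package shipped with R) and that k-medoids is an acceptable substitute
# for whatever clustering method was originally being used.
library(cluster)

set.seed(1)
x <- matrix(rnorm(10000 * 2), ncol = 2)  # stand-in for the real data

# clara() works on random subsamples (sampsize rows each), so memory use
# stays modest even for large n, unlike hclust(dist(x)), which needs
# n*(n-1)/2 stored distances.
fit <- clara(x, k = 3, samples = 5, sampsize = 100)
table(fit$clustering)  # how many observations fell in each cluster
```

The trade-off is that clara() gives a partition around medoids rather than a full dendrogram, so it only helps if a flat clustering is acceptable.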

On Wed, 30 Mar 2005 22:02:04 +0800
ronggui <0034058@fudan.edu.cn> wrote:

> root@2[ronggui]# ulimit -a
> core file size (blocks, -c) 0
> data seg size (kbytes, -d) unlimited
> file size (blocks, -f) unlimited
> max locked memory (kbytes, -l) unlimited
> max memory size (kbytes, -m) unlimited
> open files (-n) 1024
> pipe size (512 bytes, -p) 8
> stack size (kbytes, -s) 8192
> cpu time (seconds, -t) unlimited
> max user processes (-u) unlimited
> virtual memory (kbytes, -v) unlimited
>
> so it seems the data segment size is not limited.
> There is still some free memory (about 1000 KB) and swap (about 100000 KB), and the error is (I translated it from Chinese into English; maybe not exactly, but I think the meaning is right):
> error: cannot allocate vector of size 390585 Kb
> (original: 错误: 无法分配大小为390585 Kb的向量)
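For what it's worth, the size of the failing allocation is suggestive. This is only a guess, since the post does not show the clustering call, but if it builds a dist object, that object stores the lower triangle of the distance matrix as n*(n-1)/2 doubles, and for n = 10000 that comes to almost exactly the reported 390585 Kb:

```r
# Back-of-the-envelope check (an assumption, not confirmed by the post):
# a dist object for n observations holds n*(n-1)/2 doubles, 8 bytes each.
n <- 10000
kb <- n * (n - 1) / 2 * 8 / 1024
kb  # roughly 390586 KB, matching the failed 390585 Kb allocation
```

If that guess is right, the data has on the order of 10,000 rows, and any method that avoids materializing the full distance matrix should sidestep this allocation entirely.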



R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Thu Mar 31 04:04:39 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:30:57 EST