Re: [R] memory tops out at 1.84gb on OS X 10.4 machine w/ 5GB ram

From: David Ruau <druau_at_ukaachen.de>
Date: Thu 15 Dec 2005 - 20:40:04 EST

Hi,
I don't know why that happens, but here is a possible workaround: load the file sequentially. Split the text file into 2 or 3 pieces, read each piece, and re-assemble the vector/list in R afterwards. I once used a similar technique to write a huge matrix to a text file.
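
A rough sketch of what I mean, using scan()'s skip and nlines arguments (untested; the file name "bigfile.txt" and the chunk size are just placeholders to adjust for your data):

chunk_size <- 500000                      # lines per piece; tune for your file
pieces <- list()
i <- 1
repeat {
    piece <- scan("bigfile.txt", what = character(),
                  skip = (i - 1) * chunk_size, nlines = chunk_size,
                  quiet = TRUE)
    if (length(piece) == 0) break         # no lines left to read
    pieces[[i]] <- piece
    i <- i + 1
}
x <- unlist(pieces)                       # re-assemble into one vector

Each pass re-opens the file and skips the lines already read, so it slows down toward the end; reading from an open file connection instead of a file name would avoid that, but the version above is closest to the split-it-in-pieces idea.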

David

On Dec 14, 2005, at 21:47, Ken Termiso wrote:

> Hi all,
>
> Sorry if this is a dumb question, but I am on 10.4 with R2.2, and when
> loading a big text file (~500MB) with scan(file, what=character) I am
> throwing malloc errors that say I am out of memory...I have 5GB on this
> machine, and Activity Monitor tells me R is only up to ~1.84GB both
> times
> this has happened (running from terminal)...
>
> I am wondering why this is happening when I still have >2GB of free
> memory
> waiting to be used...?
>
> Any advice would be much obliged,
> Ken
>
