Re: [R] Another big data size problem

From: Christian Schulz <ozric_at_web.de>
Date: Wed 28 Jul 2004 - 21:40:19 EST

Hi,

I'm working with a ~250,000 * 150 data.frame and share your problems. Last weekend I upgraded my notebook from 512 MB to 1024 MB of RAM, and things are noticeably better, especially for load(), write.table(), mysqlReadTable() and mysqlWriteTable(), because the machine starts swapping to disk once RAM is full. One example: with 512 MB, a write of a table to MySQL had still not succeeded after several hours; with 1024 MB it finishes in a few minutes.
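
If even 1 GB is tight, writing the table in chunks can keep peak memory down. A minimal sketch using DBI's dbWriteTable() (the connection details, the table name "mytable" and the data.frame big.df are placeholders, not my actual setup):

    library(RMySQL)
    con <- dbConnect(MySQL(), dbname = "mydb")   # placeholder connection
    chunk <- 10000                               # rows per write
    n <- nrow(big.df)
    for (i in seq(1, n, by = chunk)) {
        rows <- i:min(i + chunk - 1, n)
        dbWriteTable(con, "mytable", big.df[rows, ],
                     overwrite = (i == 1),       # create the table on the first chunk
                     append    = (i > 1),        # append on the rest
                     row.names = FALSE)
    }
    dbDisconnect(con)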

Regards, Christian

On Wednesday, 28 July 2004 at 04:10, Federico Gherardini wrote:
> Hi all,
>
> I'm trying to read a 1220 * 20000 table into R but I'm having a lot of
> problems. Basically what happens is that R.bin starts eating all my
> memory until it reaches about 90%. At that point it locks itself in an
> uninterruptible sleep state (at least that's what top says), where it just
> sits there barely using the CPU at all but keeping its tons of memory. I've
> tried read.table and scan, but neither did the trick. I've also
> tried a horrible hack: reading one line at a time and gradually
> combining everything into a matrix using rbind... nope! It seems I can read
> up to 500 lines in a *decent* time but nothing more. The machine is a 3 GHz
> P4 with HT and 512 MB RAM running R-1.8.1. Will I have to write a little
> C program myself to handle this, or am I missing something?
>
> Thanks in advance for your help,
>
> fede
>
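
For the read problem itself: growing the result with rbind() copies the whole matrix on every iteration, which is probably why it crawls after a few hundred lines. If the file is purely numeric, scan() straight into a preallocated matrix is far cheaper. A rough sketch (the filename and the assumption of whitespace-separated numbers are mine):

    # read all 1220 * 20000 values in one pass, then shape them;
    # byrow = TRUE because the file is laid out one table row per line
    x <- matrix(scan("data.txt", what = double()),
                nrow = 1220, ncol = 20000, byrow = TRUE)

If you need read.table(), passing colClasses = "numeric" and nrows = 1220 spares it the type guessing and lets it preallocate.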



