Fw: [R] Another big data size problem

From: Federico Gherardini <f.gherardini_at_pigrecodata.net>
Date: Wed 28 Jul 2004 - 21:40:58 EST


On Wed, 28 Jul 2004 09:53:08 +0200
Uwe Ligges <ligges@statistik.uni-dortmund.de> wrote:

>
> If your data is numeric, you will need roughly
>
> 1220 * 20000 * 8 / 1024 / 1024 ~~ 200 MB
>
> just to store one copy in memory. If you need more than two copies, your
> machine with its 512MB will start to use swap space .....
> Hence either use a machine with more memory, or don't use all the data
> at once in memory, e.g. by making use of a database.
>
> Uwe Ligges
>
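
Just to spell the estimate out in R (the 1220 x 20000 dimensions come from my original post; a double takes 8 bytes):

# back-of-the-envelope footprint of one 1220 x 20000 numeric matrix
1220 * 20000 * 8 / 1024^2    # ~186 MB for a single copy in memory
# object.size(matrix(0, 1220, 20000)) gives essentially the same figure,
# and R typically needs two or three such copies while manipulating the data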

Well, I'd be happy if it used swap space instead of locking itself up! By the way, I don't think the problem is entirely related to memory consumption. I have written a little function that reads the data row by row and prints a progress message each time, so I can monitor what it is doing. Everything slows to a horrible crawl long before my memory is exhausted, i.e. after about 100 lines. It seems as if R has problems managing very large objects per se. I'll also try upgrading to 1.9 and see what happens.
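
In case it helps, here is a minimal sketch of the sort of loop I'm running (the file name and separator are placeholders, not the real ones):

# read one line at a time and append it to the result, printing progress
con <- file("bigdata.txt", open = "r")
dat <- NULL
i <- 0
repeat {
  line <- readLines(con, n = 1)
  if (length(line) == 0) break              # end of file
  row <- as.numeric(strsplit(line, "\t")[[1]])
  dat <- rbind(dat, row)                    # copies the whole object each time
  i <- i + 1
  print(i)                                  # progress monitor
}
close(con)

One thing I notice writing it out: growing the result with rbind() forces R to copy the whole accumulated object on every iteration, which by itself could account for the slowdown after a hundred or so rows, independently of total memory use. Preallocating a matrix of the right size and filling it by index would avoid that.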

Ernesto Jardim wrote:

>Hi,
>
>It looks like you're running Linux!? If so, it will be quite easy to
>create a table in MySQL, upload all the data into the database and
>access the data with RMySQL (it's _very_ fast). Probably there will be
>some operations that you can do in MySQL instead of "eating" memory in
>R.

>Regards

>EJ

I'll give that a try.
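
Something along these lines, I assume (the database name, table name and login details below are made up for illustration):

library(RMySQL)    # DBI interface to MySQL

# connect to a local MySQL server
con <- dbConnect(MySQL(), dbname = "mydb", user = "fede")

# one-off upload of the flat file into a table
# (MySQL's LOAD DATA INFILE would avoid reading the whole file into R first)
dat <- read.table("bigdata.txt")
dbWriteTable(con, "bigdata", dat, overwrite = TRUE)

# afterwards, fetch only the rows/columns actually needed
chunk <- dbGetQuery(con, "SELECT * FROM bigdata LIMIT 100")

dbDisconnect(con)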

Thanks to everybody for their time.

fede



