[R] Another big data size problem

From: Federico Gherardini <f.gherardini_at_pigrecodata.net>
Date: Wed 28 Jul 2004 - 12:10:31 EST


Hi all,

I'm trying to read a 1220 * 20000 table into R, but I'm having a lot of problems. Basically, what happens is that R.bin starts eating all my memory until it has about 90% of it. At that point it locks itself into an uninterruptible sleep state (at least that's what top says), where it just sits there, barely using the CPU at all, but keeping its tons of memory. I've tried read.table and scan, but neither did the trick. I've also tried a horrible hack: reading one line at a time and gradually combining everything into a matrix with rbind... nope! It seems I can read up to 500 lines in a *decent* time, but nothing more.

The machine is a 3 GHz P4 with HT and 512 MB RAM, running R 1.8.1. Will I have to write a little C program myself to handle this, or am I missing something?
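In case it helps, here is roughly what the attempts look like (the file name "big.dat" is just a placeholder, and the whitespace-splitting is a guess at my own file's format):

    ## Plain attempts -- both eat all the memory and then hang:
    d <- read.table("big.dat")
    x <- scan("big.dat")

    ## The horrible hack: read one line at a time and rbind it onto a
    ## growing matrix. Since rbind copies the whole matrix on every
    ## iteration, it gets slower and slower -- usable up to ~500 lines.
    con <- file("big.dat", "r")
    m <- NULL
    repeat {
        line <- readLines(con, n = 1)
        if (length(line) == 0) break          # EOF
        fields <- strsplit(sub("^[ \t]+", "", line), "[ \t]+")[[1]]
        m <- rbind(m, as.numeric(fields))
    }
    close(con)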

Thanks in advance for your help,

fede



R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Received on Wed Jul 28 10:16:59 2004

This archive was generated by hypermail 2.1.8 : Fri 18 Mar 2005 - 02:40:22 EST