Re: [R] increasing memory


From: Prof Brian Ripley (ripley@stats.ox.ac.uk)
Date: Wed 05 May 2004 - 16:05:53 EST


Message-id: <Pine.LNX.4.44.0405050700430.9946-100000@gannet.stats>

The commands do matter. Both ?read.table and the Data Import/Export
Manual tell you ways to speed up reading a table.
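A hedged sketch of what those documents recommend (the file and column types below are made-up stand-ins, not the poster's data): supplying colClasses and nrows up front lets read.table skip its type-guessing pass and allocate the result once, and comment.char = "" disables comment scanning.

```r
# Hypothetical example; the column classes are assumptions for illustration.
# Declaring types up front avoids read.table's expensive guessing pass,
# and nrows lets it allocate the full result in one go.
tf <- tempfile(fileext = ".csv")
write.csv(data.frame(id = 1:5, score = c(2.5, 3.1, 4.0, 1.2, 5.5)),
          tf, row.names = FALSE)
w <- read.table(tf, header = TRUE, sep = ",",
                colClasses = c("integer", "numeric"),
                nrows = 5, comment.char = "")
str(w)
unlink(tf)
```

On files of a few hundred megabytes these options can make the difference between minutes and hours, since the guessing pass otherwise rereads every field as character first.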

However, there seems to be a problem with either Mac OS X or perhaps your
hardware that is probably impossible to diagnose remotely. A Unix system
should be able to kill any process you own with kill -9 (unless it is out
of other resources, e.g. processes to run kill): that's not an R issue.

On Tue, 4 May 2004, Janet Rosenbaum wrote:

>
>
> > If it actually crashes there is a bug, but I suspect that it stops with an
> > error message -- please do read the posting guide and tell us exactly what
> > happens.
>
> Sorry, I hadn't realized that "crash" means to give an error message on
> this mailing list.
>
> To me, "crash" means that the computer freezes entirely, or, if I'm
> lucky, it just runs for several hours without doing anything, the
> process can't even be killed with -9, and the computer can't be
> shut down but has to be power-cycled.
>
> For instance, I left it doing a read.table on a text format file from this
> data (a few hundred megs) and eight hours later it was still "going".
> I watched the process with "top" for a while and the computer had plenty
> of free memory -- over 100 M this whole time -- and R was using almost no
> CPU.

It may still have been swapping.
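A back-of-the-envelope check makes the swapping explanation plausible (the row and column counts below are hypothetical, not taken from the poster's files): a numeric cell costs 8 bytes in R, so a parsed table can occupy several times the size of its csv, before counting the transient copies made during the read.

```r
# Rough estimate of the in-memory footprint of a parsed table.
# The dimensions are made-up stand-ins for a few-hundred-MB file.
rows  <- 2e6
cols  <- 20
bytes <- rows * cols * 8   # 8 bytes per numeric cell
cat(sprintf("~%.0f MB resident, before transient copies\n", bytes / 2^20))
```

On a machine with 620 M of RAM, a footprint like that plus R's working copies would push the system into swap even while "top" still reports some free memory.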

> I have tried all sorts of ways of reading in the data. It's best if I
> can read the xport file since that has all the original labels which
> don't get to the text file, but read.xport actually freezes the
> computer.
>
> As I said, I am running R 1.8.1 which claims to be the most recent
> version (when I type is.RAqua.updated()) on an ibook G3/800 with 620 M
> RAM (the maximum) running 10.3.3.
>
> The command really doesn't much matter. These are totally normal files
> and I can load in the normal sized files with the exact same
> commands.
> > w<-read.table("pedagogue.csv",header=T, sep=",")
> > library(foreign)
> > w<-read.xport("demagogue.xpt")
>
> The xpt files are up to 400 M, and the csv files are about 100 M.
>
> Janet
>

-- 
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



This archive was generated by hypermail 2.1.3 : Mon 31 May 2004 - 23:05:07 EST