Re: [R] problem

From: David Winsemius <>
Date: Thu, 06 Mar 2008 05:00:17 +0000 (UTC)

Philipp Pagel <> wrote in news:20080305120637.GA8181_at_localhost:

> On Wed, Mar 05, 2008 at 12:32:19PM +0100, Erika Frigo wrote:
>> My file has not only more than a million values, but more than a
>> million rows and more or less 30 columns (it is a production dataset
>> for cows); in fact, with read.table I'm not able to import it.
>> It is an xls file.

There is something very wrong here. Versions of Excel before Excel 2007 cannot handle files with a million rows: the worksheet limit was 65,536 rows, and still earlier versions allowed only 16,384.

David Winsemius

> read.table() expects plain text -- e.g. CSV, or tab-separated in the
> case of read.delim(). If your file is in xls format, the simplest
> option is to export the data to CSV format from Excel.
> If for some reason that is not an option, please have a look at the
> "R Data Import/Export" manual.
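The export-then-read route above can be sketched as follows; the file name and column layout are invented for illustration (a tiny sample file is written first so the snippet runs on its own). For a million-row file, pre-declaring colClasses spares read.delim() its type-guessing pass and a fair amount of memory:

```r
## Minimal sketch of reading a tab-separated export from Excel.
## The column names and values here are hypothetical.
tmp <- tempfile(fileext = ".txt")
writeLines(c("cow_id\tmilk_kg",
             "A001\t28.4",
             "A002\t31.0"), tmp)

## colClasses tells read.delim the type of each column up front,
## which avoids a second scan of the file to guess types.
dat <- read.delim(tmp, colClasses = c("character", "numeric"))
str(dat)
```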
> Of course, neither will solve the problem of not having enough
> memory if your file is simply too large. In that case you may want
> to put your data into a database and have R connect to it and
> retrieve the data in smaller chunks as required.
> cu
> Philipp
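The database route can be sketched as below, assuming the DBI and RSQLite packages are available; the table and column names are invented. The real dataset would live in a file-backed database, but an in-memory one is used here so the example runs on its own:

```r
## Sketch: store the data in SQLite, then fetch it in chunks.
library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), ":memory:")
dbWriteTable(con, "production",
             data.frame(cow_id = 1:10, milk_kg = runif(10)))

## Issue the query once, then fetch a fixed number of rows at a time.
res <- dbSendQuery(con, "SELECT * FROM production")
rows_seen <- 0
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 4)          # use e.g. n = 50000 for real data
  rows_seen <- rows_seen + nrow(chunk)  # ... process each chunk here ...
}
dbClearResult(res)
dbDisconnect(con)
rows_seen
```

Each chunk is an ordinary data frame, so the per-chunk processing can use whatever R code would have been applied to the full table.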
______________________________________________
mailing list -- PLEASE do read the posting guide and provide commented, minimal, self-contained, reproducible code.
Received on Thu 06 Mar 2008 - 05:04:58 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Thu 06 Mar 2008 - 06:30:19 GMT.

