Re: [R] problem

From: Philipp Pagel <>
Date: Wed, 05 Mar 2008 13:06:37 +0100

On Wed, Mar 05, 2008 at 12:32:19PM +0100, Erika Frigo wrote:
> My file has not only more than a million values, but more than a million
> rows and roughly 30 columns (it is a production dataset for cows); in fact,
> with read.table I'm not able to import it.
> It is an xls file.

read.table() expects plain text -- e.g. CSV, or tab-separated in the case of read.delim(). If your file is in xls format, the simplest option would be to export the data to CSV format from Excel.
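A minimal sketch of reading such an export (the file here is a small demo standing in for the exported data; the column names are illustrative):

```r
# Create a small demo CSV standing in for data exported from Excel
tmp <- tempfile(fileext = ".csv")
writeLines(c("id,breed,yield",
             "1,Holstein,28.4",
             "2,Jersey,22.1"), tmp)

# read.csv() is read.table() with CSV-friendly defaults (sep = ",", header = TRUE)
cows <- read.csv(tmp)

# Note: in some locales Excel writes ';'-separated files with ',' as the
# decimal mark -- in that case use read.csv2() instead.
dim(cows)
```

For a tab-separated export, read.delim() supplies the matching defaults (sep = "\t", header = TRUE).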

If for some reason that is not an option, please have a look at the "R Data Import/Export" manual.

Of course, neither approach will solve the problem of insufficient memory if your file is simply too large. In that case you may want to put your data into a database and have R connect to it, retrieving the data in smaller chunks as required.
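The chunked idea can be sketched with base R alone, using read.table()'s skip and nrows arguments (with a database, the loop body would instead issue queries with LIMIT/OFFSET). The file and chunk size here are illustrative:

```r
# Demo file standing in for a very large tab-separated export
tmp <- tempfile(fileext = ".txt")
write.table(data.frame(id = 1:10, yield = rnorm(10)),
            tmp, sep = "\t", row.names = FALSE)

chunk_size <- 4
# Read just the header once to get the column names
header <- names(read.table(tmp, sep = "\t", header = TRUE, nrows = 1))

skip  <- 1          # skip the header line
total <- 0
repeat {
  chunk <- tryCatch(
    read.table(tmp, sep = "\t", skip = skip, nrows = chunk_size,
               col.names = header),
    error = function(e) NULL)          # no lines left in the file
  if (is.null(chunk) || nrow(chunk) == 0) break
  total <- total + nrow(chunk)         # process the chunk here
  skip  <- skip + nrow(chunk)
}
total   # all rows seen, but at most chunk_size of them in memory at a time
```

Only one chunk is ever held in memory, which is the whole point when the full table will not fit.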



Dr. Philipp Pagel                              Tel.  +49-8161-71 2131
Lehrstuhl für Genomorientierte Bioinformatik   Fax.  +49-8161-71 2186
Technische Universität München
Wissenschaftszentrum Weihenstephan
85350 Freising, Germany
Institut für Bioinformatik und Systembiologie / MIPS
Helmholtz Zentrum München -
Deutsches Forschungszentrum für Gesundheit und Umwelt
Ingolstädter Landstrasse 1
85764 Neuherberg, Germany

______________________________________________ mailing list
PLEASE do read the posting guide
and provide commented, minimal, self-contained, reproducible code.
Received on Wed 05 Mar 2008 - 12:25:50 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Thu 06 Mar 2008 - 05:30:19 GMT.

