[R] memory cleaning

From: Claude Messiaen - Urc Necker <stagiaire2.urc_at_nck.ap-hop-paris.fr>
Date: Fri 22 Jul 2005 - 23:35:31 EST


Hi R Users,
After some research I haven't found what I'm looking for. I'm manipulating a data frame with 70k rows and 30 variables, and I run out of memory when exporting it to a *.txt file.

After some computing I used:

> memory.size()/1048576.0

[1] 103.7730

and then I make my export:

> write.table(cox,"d:/tablefinal2.txt",row.names=F,sep=';')
> memory.size()/1048576.0

[1] 241.9730
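(A hedged aside: `write.table()` converts the whole data frame to its character representation before writing, which can roughly double the memory footprint at peak. One common workaround is to write the table in pieces so the full character conversion never sits in memory at once. A minimal sketch, assuming `cox` is the data frame above; the chunk size of 10000 is an arbitrary choice:)

```r
## Write `cox` in row chunks to keep peak memory down.
chunk  <- 10000                  # arbitrary chunk size (assumption)
n      <- nrow(cox)
starts <- seq(1, n, by = chunk)
for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk - 1, n)
  write.table(cox[rows, ],
              "d:/tablefinal2.txt",
              row.names = FALSE, sep = ";",
              col.names = (i == 1),   # header only on the first chunk
              append    = (i > 1))    # append subsequent chunks
}
```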

I'm surprised, so I try removing some objects:

> rm(trait, tany, tnor, toth, suivauxdany, dnor, doth, mod1,
+    mod2, mod3, lok1, lok2, lok3, aux, risque, risk)

and check memory space:
> memory.size()/1048576.0

[1] 242.1095
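(A hedged note on why the number doesn't drop: `rm()` only removes the bindings; the memory itself is reclaimed lazily, at the next garbage collection. Calling `gc()` explicitly after `rm()` should make the figure fall. A minimal sketch, using placeholder object names:)

```r
## rm() drops the bindings, but R frees the pages only at the
## next garbage collection; force one with gc() and re-check.
rm(trait, tany, tnor)    # remove the large objects (placeholders)
gc()                     # force a garbage collection
memory.size() / 1048576  # footprint should now be lower (Windows-only)
```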

First, I don't understand why the memory used *increases* when I remove objects. Second, why does the memory used double when I make the export? I look forward to your reply.

Claude




R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Fri Jul 22 23:43:13 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:33:55 EST