[R] Memory problem using predict function

From: Brad Timm <bradtimm_at_gmail.com>
Date: Mon, 17 Dec 2007 09:41:49 -0500


I am trying to make a predicted vegetation map using the predict() function and am running into an issue with memory size.

Specifically, I am building a random forest classification (object "vegmap.rf") using the randomForest package, and then applying it to a data frame of predictors to construct a predicted map (object "testvegmap.pred"):

          testvegmap.pred <- predict(vegmap.rf, veg)

When I try to run this I get the error message: "cannot allocate vector of size 88.0Mb"

I have used the commands below to check memory usage and to increase the memory limit to 4000Mb (the largest I can seemingly expand to):

          memory.size(max=FALSE)
          memory.limit(size=4000)

Any suggestions? Is my only option to reduce the size of the area for which I am trying to make a predicted map?
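One common workaround (not from the original post) is to predict over the new data in chunks rather than all at once, so that only one chunk's worth of intermediate vectors is ever in memory. The sketch below assumes `vegmap.rf` and `veg` are as described above; `predict_in_chunks` and `chunk_size` are hypothetical names introduced here for illustration.

```r
## Sketch: chunked prediction to reduce peak memory use.
## Assumes `model` is a fitted randomForest classifier and
## `newdata` is a (possibly very large) data frame of predictors.
predict_in_chunks <- function(model, newdata, chunk_size = 100000) {
  n <- nrow(newdata)
  starts <- seq(1, n, by = chunk_size)
  preds <- vector("list", length(starts))
  for (i in seq_along(starts)) {
    rows <- starts[i]:min(starts[i] + chunk_size - 1L, n)
    ## store each chunk as character so factor chunks can be
    ## recombined safely afterwards
    preds[[i]] <- as.character(predict(model, newdata[rows, , drop = FALSE]))
  }
  ## recombine into a single factor of predicted classes
  factor(unlist(preds))
}

## e.g.  testvegmap.pred <- predict_in_chunks(vegmap.rf, veg)
```

Smaller chunks trade a little speed for a lower peak allocation; the 88Mb failure suggests the single-call version needs contiguous allocations the fragmented address space can no longer supply, which chunking avoids.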

Thanks
Brad




R-help_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
Received on Mon 17 Dec 2007 - 15:15:20 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Mon 17 Dec 2007 - 16:30:19 GMT.
