[R] Heat map on large sample size

From: Anh Tran <popophobia_at_gmail.com>
Date: Thu, 08 May 2008 13:00:33 -0700


Hey,
I'm trying to generate a heat map of 30,000 fragments from probably 5-10 samples. Windows complains about a memory shortage. Should I switch to a Unix system?
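
For reference, here's a minimal sketch of what I'm doing, assuming the data sit in a 30,000 x 10 numeric matrix and I call stats::heatmap() with its defaults (the object name is made up):

# Stand-in for my data: 30,000 fragments x 10 samples
m <- matrix(rnorm(30000 * 10), nrow = 30000)

# heatmap() clusters the rows by default, which requires the pairwise
# distances among all 30,000 rows (roughly 3.6 GB of doubles) --
# presumably what triggers the memory error:
# heatmap(m)

# Skipping the row dendrogram avoids that distance matrix entirely:
heatmap(m, Rowv = NA)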

Also, if I plot only 1,000 fragments, it finishes rather fast; 5,000 takes more than 10 minutes. I don't know what to expect for 30,000...
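
To get a feel for how the time grows before committing to all 30,000, I've been thinking of timing the clustering step on subsamples (reusing the matrix m from the sketch above; hclust() works from the pairwise distances among all n rows, so the cost grows much faster than linearly):

for (n in c(1000, 2000, 4000)) {
  print(system.time(hclust(dist(m[sample(nrow(m), n), ]))))
}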

And on a side note, it seems that R uses only up to 50% of the CPU while number crunching. Is there any way to get R to use 100% of the CPU? I'm using R 2.6.2 on Windows XP SP2, average config.
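
From what I've read, R itself is single-threaded, so if this machine is dual-core (or hyperthreaded) a single R process tops out around 50%. Here's a toy sketch I've been considering with the snow package -- untested guesswork on my part, and it only spreads independent jobs across worker processes rather than speeding up a single heatmap() call:

library(snow)

# Two worker R processes, one per core, over local sockets
cl <- makeCluster(2, type = "SOCK")

# Each worker takes a share of the independent jobs (dummy work here)
res <- parSapply(cl, 1:4, function(i) sum(rnorm(1e6)))

stopCluster(cl)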

Thanks.

UCLA Neurology Research Lab

-- 
Regards,
Anh Tran


______________________________________________
R-help_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.