The main task is processing consecutively received data. I have a function named "GiveNextDataPortion()" that returns a list object. This function is called about 5 million times, and so I have run into a performance problem.
I use this script:
Five million iterations of the "while" loop take about 20 seconds on my
computer, which is quite fast, but adding even a few simple extra operators
increases the test time dramatically:
it then takes more than 10 minutes.
Do you have any ideas about how to optimize this test?
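For reference, a minimal sketch of the kind of loop described above. The original GiveNextDataPortion() is not shown in the post, so the stand-in below is hypothetical: it returns a list, or NULL when the data is exhausted. The point it illustrates is that every statement inside the loop body is re-interpreted on each of the millions of iterations, which is where the slowdown comes from.

```r
# Hypothetical stand-in -- the real GiveNextDataPortion() is not given in
# the post. This one returns a list, or NULL when the data is exhausted.
make_source <- function(n) {
  i <- 0
  function() {
    i <<- i + 1
    if (i > n) NULL else list(value = i)
  }
}

# The original workload calls this about 5 million times; a small n is
# used here to keep the sketch quick to run.
GiveNextDataPortion <- make_source(10)

total <- 0
repeat {
  portion <- GiveNextDataPortion()
  if (is.null(portion)) break
  total <- total + portion$value  # every extra statement here is paid once per call
}
total  # 55 for n = 10
```

The usual remedies are to hoist any invariant work out of the loop body, or to restructure the producer so it returns data in large chunks that can be processed with vectorized operations.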
Roland Rau-3 wrote:
> diver495 wrote:
>> Using Visual Basic I can complete the same script (a simple loop of
>> 5,000,000 iterations) in 0.1 sec.
>> Is R really not suitable for heavy computing?
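R's interpreted loops are indeed slow compared to a compiled language such as Visual Basic, but the same computation expressed as a vectorized operation runs in compiled code and closes most of that gap. A sketch (the 5,000,000-element sum here is illustrative, not the poster's actual script):

```r
n <- 5e6

# Interpreted loop: pays interpreter overhead on every iteration
loop_sum <- function(n) {
  s <- 0
  i <- 1
  while (i <= n) {
    s <- s + i
    i <- i + 1
  }
  s
}

# Vectorized equivalent: a single call into compiled code.
# as.numeric() avoids 32-bit integer overflow in the sum.
vec_sum <- sum(as.numeric(seq_len(n)))

stopifnot(loop_sum(n) == vec_sum)
```

Timing the two versions with system.time() shows the vectorized form completing in a small fraction of the loop's runtime, which is why "avoid explicit loops over millions of elements" is the standard R advice.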
--
View this message in context: http://www.nabble.com/R-perfomance-question-%21%21%21-tp17984154p17997925.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Received on Thu 19 Jun 2008 - 07:17:59 GMT
Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle.
Archive generated by hypermail 2.2.0, at Thu 19 Jun 2008 - 08:31:18 GMT.