Date: Tue, 11 Feb 1997 09:04:03 -0600 (CST)
From: Bill Simpson <wsimpson@uwinnipeg.ca>
To: Ross Ihaka <ihaka@stat.auckland.ac.nz>
Subject: Re: R-alpha: What we're up to ...
In-Reply-To: <199702110826.VAA02992@stat13.stat.auckland.ac.nz>

> 2. Complex arithmetic.  This has finally hit the top of the
> priority heap.

I am curious about this. Can someone explain why this is important?

I know a lot about linear systems theory and system identification, and I
know the transfer function H(s) is a function of complex frequency s.
HOWEVER, in practice, even in FORTRAN, which does have native complex
variables, people split the data into real and imaginary parts. In fact,
all the fastest FFTs I know about do this.

Moreover, are there people out there who have time (or space) domain data
that are complex? Doesn't almost everyone have real data?

BTW, for real data, doing an FFT of two arrays, one real and one imaginary
consisting of zeros, is dreadfully inefficient (not sure; is this what R
does now?). There are very old methods of doing an FFT of real data by
packing TWO real arrays into the real and imaginary input, and then
separating out the results. That will be about twice as fast, though to me
it is distastefully kludgy. The better way is to use a REAL FFT such as the
one published by Sorensen in the 1980s. For C code, see
http://www.spektracom.de/~arndt/fxt/fxtpage.html

I translated Sorensen's real FFT to C, and I could also do the IFFT (I
never had a need, so I never did it). Perhaps a parallel pair of functions,
rfft and rifft, would be handy.

Just curious about the need for complex variables. Maybe the only
motivation is to be compatible with Splus.

Bill Simpson

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
r-testers mailing list -- For info or help, send "info" or "help",
To [un]subscribe, send "[un]subscribe"
(in the "body", not the subject !)  To: r-testers-request@stat.math.ethz.ch
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
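
[Editor's note: the "pack two real arrays into one complex transform" trick
mentioned above can be sketched as follows. This is a minimal illustration,
not code from the email: a naive O(N^2) DFT stands in for whatever FFT
routine would be used in practice, and all function names are made up for
this example. The separation step relies on the conjugate symmetry of the
transform of a real sequence, X[N-k] = conj(X[k]).]

```python
# Sketch of the classic "two real FFTs for the price of one complex FFT"
# trick.  A naive DFT is used here purely so the example is self-contained;
# in practice any complex FFT library call would take its place.
import cmath

def dft(z):
    """Naive O(N^2) complex DFT -- a stand-in for a real FFT library."""
    n = len(z)
    return [sum(z[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def dft_two_real(x, y):
    """Transforms of two real length-N sequences via ONE complex DFT.

    Pack z[t] = x[t] + i*y[t] and take Z = DFT(z).  Conjugate symmetry
    of real-input transforms then separates the two spectra:
        X[k] = (Z[k] + conj(Z[N-k])) / 2
        Y[k] = (Z[k] - conj(Z[N-k])) / (2i)
    """
    n = len(x)
    z = dft([complex(a, b) for a, b in zip(x, y)])
    X = [(z[k] + z[-k % n].conjugate()) / 2 for k in range(n)]
    Y = [(z[k] - z[-k % n].conjugate()) / 2j for k in range(n)]
    return X, Y
```

Since the single complex transform dominates the cost, this is roughly
twice as fast as transforming each real array separately, which is the
speedup claimed above; a dedicated real FFT (such as Sorensen's) avoids
the packing/unpacking step entirely.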