
This book focuses on the discrete Fourier transform (DFT), discrete convolution, and, particularly, the fast algorithms to calculate them. These topics have been at the center of digital signal processing since its beginning, and new results in hardware, theory, and applications continue to keep them important and exciting.
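
For readers who want the definitions at hand, the DFT and cyclic convolution referred to above are the standard ones, restated here for reference using the common notation of a length-N input x(n), its transform C(k), and a filter h(n):

\[ C(k) \;=\; \sum_{n=0}^{N-1} x(n)\, e^{-j 2\pi n k / N}, \qquad k = 0, 1, \ldots, N-1, \]

\[ y(n) \;=\; \sum_{m=0}^{N-1} h(m)\, x\big((n-m) \bmod N\big), \qquad n = 0, 1, \ldots, N-1. \]

Evaluated directly, each requires on the order of N^2 arithmetic operations; the fast algorithms developed in this book reduce that cost to roughly N log N.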

As far as we can tell, Gauss was the first to propose the techniques that we now call the fast Fourier transform (FFT) for calculating the coefficients in a trigonometric expansion of an asteroid's orbit in 1805 [link] . However, it was the seminal paper by Cooley and Tukey [link] in 1965 that caught the attention of the science and engineering community and, in a way, founded the discipline of digital signal processing (DSP).

The impact of the Cooley-Tukey FFT was enormous. Problems could be solved quickly that were not even considered a few years earlier. A flurry of research expanded the theory and developed excellent practical programs as well as opening new applications [link] . In 1976, Winograd published a short paper [link] that set a second flurry of research in motion [link] . This was another type of algorithm that expanded the data lengths that could be transformed efficiently and reduced the number of multiplications required. The groundwork for this algorithm had been laid earlier by Good [link] and by Rader [link] . In 1997, Frigo and Johnson developed a program they called FFTW (the Fastest Fourier Transform in the West) [link] , [link] , which is a composite of many of the ideas in other algorithms as well as new results, giving a robust, very fast system for general data lengths on a variety of computer and DSP architectures. This work won the 1999 Wilkinson Prize for Numerical Software.

It is hard to overemphasize the importance of the DFT, convolution, and fast algorithms. With a history that goes back to Gauss [link] and a compilation of references on these topics that in 1995 contained over 2,400 entries [link] , the FFT may be the most important numerical algorithm in science, engineering, and applied mathematics. New theoretical results are still appearing, advances in computers and hardware continually restate the basic questions, and new applications open new areas for research. It is hoped that this book will provide the background, references, programs, and incentive to encourage further research and results in this area, as well as provide tools for practical applications.

Studying the FFT is valuable not only for understanding a powerful tool; it is also a prototype or example of how algorithms can be made efficient and how a theory can be developed to define optimality. The history of this development also gives insight into the process of research, where timing and serendipity play interesting roles.

Much of the material contained in this book has been collected over 40 years of teaching and research in DSP; therefore, it is difficult to attribute just where it all came from. Some comes from my earlier FFT book [link] , which was sponsored by Texas Instruments, and some from the FFT chapter in [link] . Certainly the interaction with people like Jim Cooley and Charlie Rader was central, but the work with graduate students and undergraduates was probably the most formative. I would particularly like to acknowledge Ramesh Agarwal, Howard Johnson, Mike Heideman, Henrik Sorensen, Doug Jones, Ivan Selesnick, Haitao Guo, and Gary Sitton. Interaction with my colleagues Tom Parks, Hans Schuessler, Al Oppenheim, and Sanjit Mitra has been essential over many years. Support has come from the NSF, Texas Instruments, and the wonderful teaching and research environment at Rice University and in the IEEE Signal Processing Society.

Several chapters or sections are written by authors who have extensive experience and depth working on the particular topics. Ivan Selesnick has written several papers on the design of short FFTs to be used in the prime factor algorithm (PFA) FFT and on the automatic design of these short FFTs. Markus Püschel has developed a theoretical framework for “Algebraic Signal Processing” which allows a structured generation of FFT programs, and a system called “Spiral” for automatically generating algorithms specifically for an architecture. Steven Johnson, along with his colleague Matteo Frigo, created, developed, and now maintains the powerful FFTW system: the Fastest Fourier Transform in the West. I sincerely thank these authors for their significant contributions.

I would also like to thank Prentice Hall, Inc., which returned the copyright on the chapter The DFT as Convolution or Filtering from Advanced Topics in Signal Processing [link] , around which some of this book is built. The content of this book is in the Connexions repository (http://cnx.org/content/col10550/) and, therefore, is available for on-line use, PDF downloading, or purchase as a printed, bound physical book. I certainly want to thank Daniel Williamson, Amy Kavalewitz, and the staff of Connexions for their invaluable help. Additional FFT material can be found in Connexions, particularly content by Doug Jones [link] , Ivan Selesnick [link] , and Howard Johnson [link] . Note that this book and all the content in Connexions are copyrighted under the Creative Commons Attribution license (http://creativecommons.org/).

If readers find errors in any of the modules of this collection or have suggestions for improvements or additions, please email the author of the collection or module.

C. Sidney Burrus

Houston, Texas

October 20, 2008
