The DFT can be expensive to compute directly: $$\forall k,\ 0\le k\le N-1\colon\quad X(k)=\sum_{n=0}^{N-1} x(n)e^{-\left(i\times 2\pi \frac{k}{N}n\right)}$$
For each $k$, we must execute $N$ complex multiplies and $N-1$ complex adds, and there are $N$ values of $k$ at which to evaluate $X(k)$.
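As an illustrative sketch (not part of the original text), the direct computation can be written in plain Python; the nested loops over $k$ and $n$ make the $N\times N$ operation count visible. The function name `dft` is our own choice:

```python
import cmath

def dft(x):
    """Direct DFT: each of the N outputs X(k) is an N-term sum, so O(N^2) total."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

# The DFT of a unit impulse is flat: X(k) = 1 for every k.
X = dft([1, 0, 0, 0])
```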
This " $O(N^{2})$ " computation rapidly gets out of hand as $N$ gets large:
| $N$ | 1 | 10 | 100 | 1000 | $10^{6}$ |
|---|---|---|---|---|---|
| $N^{2}$ | 1 | 100 | 10,000 | $10^{6}$ | $10^{12}$ |
The FFT provides a much more efficient way of computing the DFT: it requires only " $O(N\lg N)$ " operations to compute the $N$-point DFT.
| $N$ | 10 | 100 | 1000 | $10^{6}$ |
|---|---|---|---|---|
| $N^{2}$ | 100 | 10,000 | $10^{6}$ | $10^{12}$ |
| $N\lg N$ | 10 | 200 | 3000 | $6\times 10^{6}$ |
How long is $10^{12}\ \mu\mathrm{sec}$? More than 10 days! How long is $6\times 10^{6}\ \mu\mathrm{sec}$? About 6 seconds.
The FFT and digital computers revolutionized DSP (1960 - 1980).
Define the twiddle factor $$W_{N}=e^{-\left(i\frac{2\pi}{N}\right)}$$ It satisfies two key properties.

Symmetry: $$W_{N}^{k(N-n)}=W_{N}^{-kn}=\overline{W_{N}^{kn}}$$ since $$e^{-\left(i\times 2\pi \frac{k}{N}(N-n)\right)}=e^{i\times 2\pi \frac{k}{N}n}=\overline{e^{-\left(i\times 2\pi \frac{k}{N}n\right)}}$$

Periodicity: $$W_{N}^{kn}=W_{N}^{k(N+n)}=W_{N}^{(k+N)n}$$ since $$e^{-\left(i\frac{2\pi}{N}kn\right)}=e^{-\left(i\frac{2\pi}{N}k(N+n)\right)}=e^{-\left(i\frac{2\pi}{N}(k+N)n\right)}$$
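These identities can be checked numerically. A quick sketch, with illustrative values $N=8$, $k=3$, $n=2$ (our choice, not from the text):

```python
import cmath

N, k, n = 8, 3, 2
W = cmath.exp(-2j * cmath.pi / N)  # W_N = e^{-i 2*pi/N}

# Symmetry: W_N^{k(N-n)} = W_N^{-kn} = conjugate(W_N^{kn})
sym = W ** (k * (N - n))
assert abs(sym - W ** (-k * n)) < 1e-12
assert abs(sym - (W ** (k * n)).conjugate()) < 1e-12

# Periodicity: W_N^{kn} = W_N^{k(N+n)} = W_N^{(k+N)n}
per = W ** (k * n)
assert abs(per - W ** (k * (N + n))) < 1e-12
assert abs(per - W ** ((k + N) * n)) < 1e-12
```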
$N$ is even, so we can compute $X(k)$ by separating $x(n)$ into two subsequences, each of length $\frac{N}{2}$: the even-indexed samples $x(2r)$ and the odd-indexed samples $x(2r+1)$, where $0\le r\le \frac{N}{2}-1$. Starting from $$\forall k,\ 0\le k\le N-1\colon\quad X(k)=\sum_{n=0}^{N-1} x(n)W_{N}^{kn}$$ we split the sum: $$X(k)=\sum_{n=2r} x(n)W_{N}^{kn}+\sum_{n=2r+1} x(n)W_{N}^{kn}=\sum_{r=0}^{\frac{N}{2}-1} x(2r)W_{\frac{N}{2}}^{kr}+W_{N}^{k}\sum_{r=0}^{\frac{N}{2}-1} x(2r+1)W_{\frac{N}{2}}^{kr}$$ using $W_{N}^{2kr}=W_{\frac{N}{2}}^{kr}$. So $X(k)=G(k)+W_{N}^{k}H(k)$, where $G(k)$ and $H(k)$ are the $\frac{N}{2}$-point DFTs of the even- and odd-indexed subsequences.
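The even/odd split can be verified numerically. A sketch, assuming a length-8 test signal of our choosing; the helper names `dft` and `split_dft` are ours:

```python
import cmath

def dft(x):
    """Direct N-point DFT (O(N^2) reference)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def split_dft(x):
    """One decimation step: X(k) = G(k mod N/2) + W_N^k * H(k mod N/2),
    where G, H are the N/2-point DFTs of the even/odd subsequences."""
    N = len(x)
    G = dft(x[0::2])  # even-indexed samples x(2r)
    H = dft(x[1::2])  # odd-indexed samples x(2r+1)
    W = cmath.exp(-2j * cmath.pi / N)
    return [G[k % (N // 2)] + W ** k * H[k % (N // 2)] for k in range(N)]

x = [0.5, 1.0, -2.0, 3.0, 0.0, -1.5, 2.5, 4.0]
assert all(abs(a - b) < 1e-9 for a, b in zip(split_dft(x), dft(x)))
```

Note that $G$ and $H$ are periodic with period $\frac{N}{2}$, which is why indexing them modulo $\frac{N}{2}$ covers all $N$ values of $k$.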
Why would we want to do this? Because it is more efficient!
For $N=1000$, direct computation costs $$N^{2}=10^{6}$$ operations, while the split into two $\frac{N}{2}$-point DFTs costs $$\frac{N^{2}}{2}+N=\frac{10^{6}}{2}+1000$$ Because 1000 is small compared to 500,000, $$\frac{N^{2}}{2}+N\approx \frac{10^{6}}{2}$$ roughly half the direct cost.
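A two-line check of this arithmetic (a sketch; the variable names are ours):

```python
N = 1000
direct = N ** 2              # 1,000,000 complex operations
one_split = N ** 2 // 2 + N  # 501,000 -- already roughly half the work
```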
So why stop here?! Keep decomposing: break each of the $\frac{N}{2}$-point DFTs into two $\frac{N}{4}$-point DFTs, and so on.
We can keep halving: $$\frac{N}{2}, \frac{N}{4}, \frac{N}{8}, \ldots, \frac{N}{2^{m-1}}, \frac{N}{2^{m}}=1$$ where $m=\log_{2}N$ is the number of stages.
Computational cost: replacing the $N$-pt DFT with two $\frac{N}{2}$-pt DFTs changes the cost from $N^{2}$ to $2\left(\frac{N}{2}\right)^{2}+N=\frac{N^{2}}{2}+N$. Replacing each $\frac{N}{2}$-pt DFT with two $\frac{N}{4}$-pt DFTs reduces the cost further to $$2\left(2\left(\frac{N}{4}\right)^{2}+\frac{N}{2}\right)+N=4\left(\frac{N}{4}\right)^{2}+2N=\frac{N^{2}}{2^{2}}+2N$$ In general, after $p$ stages of decomposition the cost is $\frac{N^{2}}{2^{p}}+pN$. Continuing for $p=3, 4, \ldots, m$, where $m=\log_{2}N$, we get the cost $$\frac{N^{2}}{2^{\log_{2}N}}+N\log_{2}N=\frac{N^{2}}{N}+N\log_{2}N=N+N\log_{2}N$$ $N+N\log_{2}N$ is the total number of complex adds and multiplies.
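Carrying the decomposition all the way down to length-1 DFTs gives the classic radix-2 decimation-in-time FFT. A minimal recursive sketch, assuming the input length is a power of two (the function name `fft` is ours):

```python
import cmath

def fft(x):
    """Radix-2 decimation-in-time FFT; len(x) must be a power of two."""
    N = len(x)
    if N == 1:
        return list(x)  # a 1-point DFT is the sample itself
    G = fft(x[0::2])  # N/2-point DFT of the even-indexed samples
    H = fft(x[1::2])  # N/2-point DFT of the odd-indexed samples
    X = [0j] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * H[k]  # W_N^k * H(k)
        X[k] = G[k] + t
        X[k + N // 2] = G[k] - t  # uses W_N^{k + N/2} = -W_N^k
    return X
```

Each stage does $O(N)$ work and there are $\log_{2}N$ stages, matching the $N+N\log_{2}N$ operation count derived above.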
For large $N$, $\mathrm{cost}\approx N\log_{2}N$, or " $O(N\log_{2}N)$ ", since $N\log_{2}N\gg N$ for large $N$.