S. W. Kahng built upon the findings of Karlovitz by considering the process of finding an adequate value of $\lambda$ for partial updating. He applied the Newton-Raphson method to this problem and proposed a closed-form solution for $\lambda$, given by

$$\lambda = \frac{1}{p-1}$$

resulting in

$$a_{k+1} = \frac{1}{p-1}\left[\hat{a}_{k+1} + (p-2)\,a_k\right]$$
The rest of Kahng's algorithm follows Karlovitz's approach. However, since $\lambda$ is fixed, there is no need to perform a line search at each iteration. Kahng's method has an added benefit: since it uses Newton's method to find $\lambda$, the algorithm tends to converge much faster than previous approaches; indeed, it has been shown to converge quadratically. However, Newton-Raphson-based algorithms are not guaranteed to converge globally unless at some point the current iterate lies close enough to the actual solution, within the method's radius of convergence [link]. Fletcher, Grant and Hebden [link] derived the same results independently.
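As a concrete illustration (not part of the original text), the iteration can be sketched in a few lines of NumPy. The design matrix `A`, desired response `d`, and the fixed iteration count are placeholder assumptions; the weights follow the usual $l_p$ IRLS reweighting $w = |e|^{(p-2)/2}$, so that the weighted least squares step minimizes $\sum |e|^p$.

```python
import numpy as np

def kahng_irls(A, d, p, iters=50, eps=1e-12):
    """Sketch of an l_p IRLS loop with Kahng's fixed partial update.
    A: design matrix (e.g., cosine terms sampled on a frequency grid),
    d: desired response samples. Illustrative only."""
    lam = 1.0 / (p - 1)                            # Kahng's closed-form lambda
    a = np.linalg.lstsq(A, d, rcond=None)[0]       # unweighted l_2 starting point
    for _ in range(iters):
        e = d - A @ a                              # current approximation error
        w = np.maximum(np.abs(e), eps) ** ((p - 2) / 2)   # l_p reweighting
        a_hat = np.linalg.lstsq(A * w[:, None], d * w, rcond=None)[0]
        a = lam * a_hat + (1.0 - lam) * a          # partial update with lambda = 1/(p-1)
    return a
```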
Burrus, Barreto and Selesnick [link], [link], [link] modified Kahng's method in several important ways in order to improve its initial and final convergence rates and its stability (we refer to this method as BBS). The first improvement is analogous to a homotopy [link]. Up to this point, all efforts in $l_p$ filter design attempted to solve the actual $l_p$ problem from the first iteration. The BBS method instead works with a sequence of exponents that grows gradually from 2 toward the desired $p$; since the solution changes smoothly with the exponent, each intermediate problem starts from a good initial guess. This fact allows one to gradually move from an $l_2$ solution to an $l_p$ solution.
To accelerate initial convergence, the BBS method of Burrus et al. initially solves for $p = 2$ by setting $p_0 = 2$ and then sets $p_k = \sigma\, p_{k-1}$, where $\sigma$ is a convergence parameter defined by $1 \le \sigma \le 2$. Therefore at the $k$-th iteration

$$p_k = \min\left(p_{\mathrm{des}},\; 2\,\sigma^{k}\right)$$

where $p_{\mathrm{des}}$ corresponds to the desired $l_p$ solution. The implementation of each iteration follows Karlovitz's method with Kahng's choice of $\lambda$, using the particular selection of $p_k$ given by [link].
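A minimal sketch of this homotopy schedule, under the same illustrative assumptions as the previous snippet (the additional stability refinements of the BBS method are omitted here):

```python
import numpy as np

def bbs_irls(A, d, p_des, sigma=1.5, iters=50, eps=1e-12):
    """Sketch of a homotopy-style l_p iteration: the exponent grows
    geometrically from 2 toward p_des, and each step applies Kahng's
    partial update with lambda = 1/(p_k - 1). Illustrative only."""
    a = np.linalg.lstsq(A, d, rcond=None)[0]       # p_0 = 2: plain least squares
    p_k = 2.0
    for _ in range(iters):
        p_k = min(p_des, sigma * p_k)              # p_k = min(p_des, 2*sigma^k)
        e = d - A @ a
        w = np.maximum(np.abs(e), eps) ** ((p_k - 2) / 2)
        a_hat = np.linalg.lstsq(A * w[:, None], d * w, rcond=None)[0]
        lam = 1.0 / (p_k - 1)                      # Kahng's lambda for the current p_k
        a = lam * a_hat + (1.0 - lam) * a
    return a
```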
To summarize, define the class of IRLS algorithms as follows: after $k$ iterations, given a coefficient vector $a_k$, the IRLS iteration requires two steps:
1. Update the weights from the current filter coefficients, say $w_k = f(a_k)$.
2. Update the filter coefficients from the new weights (and, for partial updating, the previous coefficients), say $a_{k+1} = g(w_k, a_k)$.
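In code, this two-step structure can be expressed as a small driver that accepts the two updating functions as parameters; the names below (`weight_fn` for $f$, `coef_fn` for $g$) are illustrative assumptions, not taken from the text.

```python
import numpy as np

def irls(A, d, weight_fn, coef_fn, iters=50):
    """Generic two-step IRLS skeleton: weight_fn plays the role of f
    (weights from the current coefficients) and coef_fn the role of g
    (new coefficients from the weights and the previous coefficients)."""
    a = np.linalg.lstsq(A, d, rcond=None)[0]   # unweighted l_2 initial guess
    for k in range(iters):
        w = weight_fn(A, d, a, k)              # step 1: w_k = f(a_k)
        a = coef_fn(A, d, w, a, k)             # step 2: a_{k+1} = g(w_k, a_k)
    return a
```

Kahng's method, for instance, corresponds to a `weight_fn` returning $|d - Aa|^{(p-2)/2}$ and a `coef_fn` that solves the weighted least squares problem and applies the $\lambda = 1/(p-1)$ partial update.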
The following is a summary of the IRLS-based algorithms discussed so far and their corresponding updating functions:
Much of the performance of a method depends on whether it actually converges for a given error measure; for the methods described above, both convergence rate and stability play an important role. The Karlovitz and RUL methods are supposed to converge linearly, while Kahng's method and the BBS method converge quadratically, since both use a Newton-based additive update of the weights.