To improve the sparsity of images having edges along regular geometric curves, Candès and Donoho (CandesD:99) introduced curvelet frames, with elongated waveforms having different directions, positions, and scales. Images with piecewise regular edges have representations that are asymptotically sparser when thresholding curvelet coefficients rather than wavelet coefficients.
In a redundant dictionary $\mathcal{D} = \{\phi_p\}_{p \in \Gamma}$, we would like to find the best approximation support $\Lambda$ with $M$ vectors, which minimizes the error $\|f - f_\Lambda\|^2$. Chapter 12 proves that it is equivalent to find $\Lambda_T$, which minimizes the corresponding approximation Lagrangian
$$\mathcal{L}_0(T, f, \Lambda) = \|f - f_\Lambda\|^2 + T^2 |\Lambda|$$
for some multiplier $T$.
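As a minimal numerical illustration (not the book's algorithm), the Lagrangian above can be evaluated for any candidate support by orthogonally projecting $f$ onto the selected atoms; the `dictionary` matrix holding atoms as columns is an assumption of this sketch:

```python
import numpy as np

def approximation_lagrangian(f, dictionary, support, T):
    """Evaluate L0(T, f, support) = ||f - f_support||^2 + T^2 * |support|,
    where f_support is the orthogonal projection of f onto the span of the
    dictionary columns indexed by `support`."""
    Phi = dictionary[:, support]                       # selected atoms
    coeffs, *_ = np.linalg.lstsq(Phi, f, rcond=None)   # projection coefficients
    residual = f - Phi @ coeffs
    return float(np.sum(residual**2)) + T**2 * len(support)
```

Minimizing this quantity over all supports $\Lambda$ is exactly the combinatorial search discussed below; the helper only scores one candidate.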
Compression and denoising are two applications of redundant dictionary approximations. When compressing signals by quantizing dictionary coefficients, the distortion rate varies, like the Lagrangian $\mathcal{L}_0$, with a multiplier $T$ that depends on the quantization step. Optimizing the coder is thus equivalent to minimizing this approximation Lagrangian. For sparse representations, most of the bits are devoted to coding the geometry of the sparse approximation set $\Lambda_T$ in $\Gamma$.
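The coefficient quantization step can be sketched with a uniform quantizer, an assumption for illustration rather than the book's coder; coefficients smaller than half the step quantize to zero, so the step plays the role of the threshold $T$ that defines the coded support:

```python
import numpy as np

def quantize_coefficients(coeffs, step):
    """Uniform quantizer: round each coefficient to the nearest multiple of
    `step`. Coefficients below step/2 map to zero, so the nonzero positions
    form the approximation support whose geometry must be coded."""
    quantized = step * np.round(coeffs / step)
    support = np.flatnonzero(quantized)  # indices coded as "geometry" bits
    return quantized, support
```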
Estimators reducing noise from observations are also optimized by finding a best orthogonal projector over a set of dictionary vectors. The model selection theory of Barron, Birgé, and Massart (massart-birge-barron) proves that finding $\Lambda_T$, which minimizes this same Lagrangian $\mathcal{L}_0$, defines an estimator that has a risk on the same order as the minimum approximation error, up to a logarithmic factor. This is similar to the optimality result obtained for thresholding estimators in an orthonormal basis.
The bad news is that minimizing the approximation Lagrangian $\mathcal{L}_0$ is an NP-hard problem and is therefore computationally intractable. It is therefore necessary to find algorithms that are sufficiently fast to compute suboptimal, but “good enough,” solutions.
To reduce the complexity of optimal approximations, the search can be reduced to subfamilies of orthogonal dictionary vectors. In a dictionary $\mathcal{D}$ of orthonormal bases, any family of orthogonal dictionary vectors can be complemented to form an orthogonal basis included in $\mathcal{D}$. As a result, the best approximation of $f$ from $M$ orthogonal vectors in $\mathcal{D}$ is obtained by thresholding the coefficients of $f$ in a “best basis” in $\mathcal{D}$.
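Within a single orthonormal basis, the best $M$-term approximation reduces to keeping the $M$ largest-magnitude coefficients. A minimal sketch, assuming the basis vectors are the columns of an orthogonal matrix `basis`:

```python
import numpy as np

def best_m_term(f, basis, M):
    """Best M-term approximation of f in an orthonormal basis: keep the M
    largest-magnitude coefficients and reconstruct by synthesis."""
    coeffs = basis.T @ f                     # analysis in the orthonormal basis
    keep = np.argsort(np.abs(coeffs))[-M:]   # indices of the M largest
    sparse = np.zeros_like(coeffs)
    sparse[keep] = coeffs[keep]
    return basis @ sparse                    # synthesis
```

This keep-the-largest rule is what "thresholding the coefficients" amounts to once the basis is fixed; the best-basis search below chooses which basis to threshold in.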
For tree dictionaries of orthonormal bases obtained by a recursive split of orthogonal vector spaces, the fast dynamic programming algorithm of Coifman and Wickerhauser (CoifmanMW:92) finds such a best basis with $O(P)$ operations, where $P$ is the dictionary size.
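The dynamic program can be sketched as a bottom-up comparison on a binary tree of subspaces: a node keeps its own basis if its cost beats the summed cost of its two children. The `(depth, node)` keying of `node_coeffs` and the $\ell^0$-type additive cost are assumptions of this sketch; any additive cost works:

```python
import numpy as np

def l0_cost(coeffs, T):
    """Additive cost: number of coefficients above threshold T."""
    return int(np.sum(np.abs(coeffs) > T))

def best_basis(node_coeffs, node, depth, max_depth, T):
    """Bottom-up best-basis search on a binary tree of bases.
    `node_coeffs[(depth, node)]` holds the coefficients of that subspace.
    Returns (cost, list of (depth, node) leaves of the chosen basis).
    Each node is visited once, so the search is linear in the tree size."""
    here = l0_cost(node_coeffs[(depth, node)], T)
    if depth == max_depth:
        return here, [(depth, node)]
    lc, lb = best_basis(node_coeffs, 2 * node,     depth + 1, max_depth, T)
    rc, rb = best_basis(node_coeffs, 2 * node + 1, depth + 1, max_depth, T)
    if lc + rc < here:
        return lc + rc, lb + rb            # children's bases are jointly cheaper
    return here, [(depth, node)]           # keep this node's own basis
```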
Wavelet packet and local cosine bases are examples of tree dictionaries of time-frequency orthonormal bases of size $P = N \log_2 N$. A best basis is a time-frequency tiling that best matches the signal's time-frequency structures.
To approximate geometrically regular edges, wavelets are not as efficient as curvelets, but wavelets provide sparser representations of singularities that are not distributed along geometrically regular curves. Bandlet dictionaries, introduced by Le Pennec, Mallat, and Peyré (bandelets-siam, bandlets-peyre), are dictionaries of orthonormal bases that can adapt to the variability of the image's geometric regularity. Minimax optimal asymptotic rates are derived for compression and denoising.