Approximating signals only from orthogonal vectors brings a rigidity that limits the ability to optimize the representation. Pursuit algorithms remove this constraint with flexible procedures that search for sparse, although not necessarily optimal, dictionary approximations. Such approximations are computed by optimizing the choice of dictionary vectors.
Matching pursuit algorithms, introduced by Mallat and Zhang (MallatZ:93), are greedy algorithms that optimize approximations by selecting dictionary vectors one by one. The vector $\varphi_{p_0}$ in a dictionary $\mathcal{D} = \{\varphi_p\}_{p \in \Gamma}$ that best approximates a signal $f$ satisfies
$$|\langle f, \varphi_{p_0} \rangle| = \max_{p \in \Gamma} |\langle f, \varphi_p \rangle| ,$$
and the residual approximation error is
$$R f = f - \langle f, \varphi_{p_0} \rangle \, \varphi_{p_0} .$$
A matching pursuit further approximates the residue $Rf$ by selecting another best vector from the dictionary and continues this process over next-order residues $R^m f$, which produces a signal decomposition:
$$f = \sum_{m=0}^{M-1} \langle R^m f, \varphi_{p_m} \rangle \, \varphi_{p_m} + R^M f .$$
The approximation from the $M$ selected vectors can be refined with an orthogonal back projection on the space generated by these vectors. An orthogonal matching pursuit further improves this decomposition by progressively orthogonalizing the projection directions during the decomposition. The resulting decompositions are applied to compression, denoising, and pattern recognition of various types of signals, images, and videos.
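The greedy selection above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's implementation; it assumes the dictionary is stored as the columns of an array `D` that have already been normalized to unit norm:

```python
import numpy as np

def matching_pursuit(f, D, M):
    """Greedy matching pursuit of f over the columns of D.

    Assumes the columns of D have unit norm. At each step the vector
    most correlated with the current residue is selected, its
    contribution is added to the coefficients, and the residue is
    updated. Returns the coefficient vector and the final residue R^M f.
    """
    residue = f.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(M):
        inner = D.T @ residue                 # correlations <R^m f, phi_p>
        p = np.argmax(np.abs(inner))          # best-matching vector
        coeffs[p] += inner[p]
        residue -= inner[p] * D[:, p]         # R^{m+1} f = R^m f - <R^m f, phi_p> phi_p
    return coeffs, residue
```

When the dictionary is an orthonormal basis, the selected coefficients are exact and the residue vanishes once all nonzero components are picked; over a redundant dictionary the residue only decays, which is where the orthogonal back projection of the text improves the approximation.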
Approximating $f$ with a minimum number of nonzero coefficients $a[p]$ in a dictionary $\mathcal{D}$ is equivalent to minimizing the $\ell^0$ norm $\|a\|_0$, which gives the number of nonzero coefficients. This norm is highly nonconvex, which explains why the resulting minimization is NP-hard. Donoho and Chen (DonohoC:95) thus proposed replacing the $\ell^0$ norm by the $\ell^1$ norm $\|a\|_1 = \sum_{p \in \Gamma} |a[p]|$, which is convex. The resulting basis pursuit algorithm computes a synthesis operator
$$f = \sum_{p \in \Gamma} \tilde a[p] \, \varphi_p \quad \text{with} \quad \tilde a = \arg\min_a \|a\|_1 \ \ \text{subject to} \ \ f = \sum_{p \in \Gamma} a[p] \, \varphi_p .$$
This optimal solution is calculated with a linear programming algorithm. A basis pursuit is computationally more intensive than a matching pursuit, but it is a more global optimization that yields representations that can be more sparse.
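To see why linear programming applies, note that the $\ell^1$ minimization becomes linear after splitting $a$ into its positive and negative parts. A small sketch of this standard reformulation, assuming SciPy's `linprog` is available:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(f, D):
    """Solve min ||a||_1 subject to D a = f as a linear program.

    Standard reformulation: write a = a_plus - a_minus with
    a_plus, a_minus >= 0; then ||a||_1 equals sum(a_plus) + sum(a_minus)
    at the optimum, and the synthesis constraint stays linear.
    """
    n, p = D.shape
    c = np.ones(2 * p)                  # objective: sum of a_plus and a_minus
    A_eq = np.hstack([D, -D])           # D a_plus - D a_minus = f
    res = linprog(c, A_eq=A_eq, b_eq=f, bounds=[(0, None)] * (2 * p))
    return res.x[:p] - res.x[p:]
```

For instance, with dictionary columns $e_1$, $e_2$, and $e_1 + e_2$ and signal $f = e_1 + e_2$, the $\ell^1$ minimizer concentrates all the energy on the single vector $e_1 + e_2$ rather than splitting it over two vectors, which illustrates the sparsity-promoting effect of the $\ell^1$ norm.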
In approximation, compression, or denoising applications, $f$ is recovered with an error bounded by a precision parameter $\epsilon$. The exact synthesis constraint of basis pursuit is thus relaxed by finding a synthesis $\tilde a$ such that
$$\Big\| f - \sum_{p \in \Gamma} \tilde a[p] \, \varphi_p \Big\| \le \epsilon .$$
This is a convex minimization problem, with a solution calculated by minimizing the corresponding $\ell^1$ Lagrangian
$$L_1(T, f, a) = \frac{1}{2} \Big\| f - \sum_{p \in \Gamma} a[p] \, \varphi_p \Big\|^2 + T \, \|a\|_1 ,$$
where $T$ is a Lagrange multiplier that depends on $\epsilon$. This is called an $\ell^1$ Lagrangian pursuit in this book. A solution $\tilde a$ is computed with iterative algorithms that are guaranteed to converge. The number of nonzero coordinates of $\tilde a$ typically decreases as $T$ increases.
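One such convergent iterative scheme is iterative soft-thresholding, which alternates a gradient step on the quadratic term with a soft-thresholding step induced by the $\ell^1$ penalty. A minimal sketch, assuming the dictionary is given as a matrix `D` whose columns are the $\varphi_p$:

```python
import numpy as np

def soft_threshold(x, t):
    """Componentwise soft-thresholding, the proximal map of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lagrangian_pursuit(f, D, T, n_iter=500):
    """Iterative soft-thresholding for min_a 0.5*||f - D a||^2 + T*||a||_1.

    The step size 1/L uses the Lipschitz constant of the gradient of the
    quadratic term, L = ||D||^2 (largest squared singular value of D).
    """
    L = np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - f)              # gradient of the quadratic term
        a = soft_threshold(a - grad / L, T / L)
    return a
```

When $\mathcal{D}$ is an orthonormal basis, this iteration converges in one step to soft-thresholding of the coefficients of $f$, which recovers the classical wavelet-thresholding estimators as a special case; increasing `T` sets more coordinates of the solution to zero, as noted in the text.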
Matching pursuits and $\ell^1$ Lagrangian pursuits are optimal if they recover the approximation support $\Lambda_T$, which minimizes the approximation Lagrangian
$$L_0(T, f, \Lambda) = \| f - f_\Lambda \|^2 + T^2 \, |\Lambda| ,$$
where $f_\Lambda$ is the orthogonal projection of $f$ in the space $V_\Lambda$ generated by $\{\varphi_p\}_{p \in \Lambda}$. This is not always true and depends on $\Lambda_T$. An Exact Recovery Criterion proved by Tropp (tropp-multi-omp) guarantees that pursuit algorithms do recover the optimal support $\Lambda_T$ if
$$\max_{q \notin \Lambda_T} \sum_{p \in \Lambda_T} |\langle \tilde\varphi_p, \varphi_q \rangle| < 1 ,$$
where $\{\tilde\varphi_p\}_{p \in \Lambda_T}$ is the biorthogonal basis of $\{\varphi_p\}_{p \in \Lambda_T}$ in $V_{\Lambda_T}$. This criterion implies that dictionary vectors $\varphi_q$ outside $\Lambda_T$ should have small inner products with the dual vectors over $\Lambda_T$.
This recovery is stable relative to noise perturbations if $\{\varphi_p\}_{p \in \Lambda_T}$ has Riesz bounds that are not too far from 1. These vectors should be nearly orthogonal and hence have small inner products. These small inner-product conditions are interpreted as a form of incoherence. A stable recovery of $\Lambda_T$ is possible if vectors in $\Lambda_T$ are incoherent with respect to other dictionary vectors and are incoherent between themselves. It depends on the geometric configuration of $\Lambda_T$ in $\Gamma$.
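The criterion is easy to evaluate numerically, since the biorthogonal (dual) vectors of a sub-dictionary are the rows of its pseudo-inverse. A small sketch under that identification, with the dictionary again stored as columns of a NumPy array:

```python
import numpy as np

def exact_recovery_criterion(D, support):
    """Evaluate Tropp's Exact Recovery Criterion for a candidate support.

    The dual vectors {tilde phi_p} over the support are the rows of the
    pseudo-inverse of the sub-dictionary D_S, so the criterion is
    max over q outside the support of sum_p |<tilde phi_p, phi_q>|.
    A value < 1 guarantees that pursuits recover this support.
    """
    pinv = np.linalg.pinv(D[:, support])      # rows = biorthogonal vectors
    outside = [q for q in range(D.shape[1]) if q not in support]
    return max(np.sum(np.abs(pinv @ D[:, q])) for q in outside)
```

For an orthonormal family the criterion is 0 and recovery is guaranteed; adding a vector strongly correlated with the support, such as $(e_1 + e_2)/\sqrt{2}$ alongside $e_1$ and $e_2$, pushes the value above 1, matching the incoherence interpretation above.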