
Pursuit in dictionaries

Approximating signals only from orthogonal vectors brings a rigidity that limits the ability to optimize the representation. Pursuit algorithms remove this constraint with flexible procedures that search for sparse, although not necessarily optimal, dictionary approximations. Such approximations are computed by optimizing the choice of dictionary vectors $\{\varphi_p\}_{p \in \Lambda}$.

Matching pursuit

Matching pursuit algorithms, introduced by Mallat and Zhang (MallatZ:93), are greedy algorithms that optimize approximations by selecting dictionary vectors one by one. The vector $\varphi_{p_0} \in \mathcal{D}$ that best approximates a signal $f$ is

$$\varphi_{p_0} = \arg\max_{p \in \Gamma} \, |\langle f, \varphi_p \rangle|$$

and the residual approximation error is

$$R f = f - \langle f, \varphi_{p_0} \rangle \, \varphi_{p_0}.$$

A matching pursuit further approximates the residue $Rf$ by selecting another best vector $\varphi_{p_1}$ from the dictionary, and continues this process over the next-order residues $R^m f$, which produces a signal decomposition:

$$f = \sum_{m=0}^{M-1} \langle R^m f, \varphi_{p_m} \rangle \, \varphi_{p_m} + R^M f.$$

The approximation from the $M$ selected vectors $\{\varphi_{p_m}\}_{0 \le m < M}$ can be refined with an orthogonal back projection on the space generated by these vectors. An orthogonal matching pursuit further improves this decomposition by progressively orthogonalizing the projection directions $\varphi_{p_m}$ during the decomposition. The resulting decompositions are applied to compression, denoising, and pattern recognition of various types of signals, images, and videos.
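The greedy selection loop described above can be sketched as follows. This is a minimal illustration, not the book's implementation: it assumes the dictionary is stored as a matrix `D` whose columns are unit-norm vectors, and all names are illustrative.

```python
import numpy as np

def matching_pursuit(f, D, M):
    """Greedy matching pursuit of f over the columns of D.

    D is an (n, P) matrix whose columns are unit-norm dictionary vectors.
    Returns the selected indices p_m, the coefficients <R^m f, phi_{p_m}>,
    and the final residue R^M f.
    """
    residue = f.astype(float).copy()
    indices, coeffs = [], []
    for _ in range(M):
        # Select the dictionary vector best correlated with the residue.
        corr = D.T @ residue
        p = int(np.argmax(np.abs(corr)))
        c = corr[p]
        indices.append(p)
        coeffs.append(c)
        # Subtract the projection on the selected vector.
        residue = residue - c * D[:, p]
    return indices, coeffs, residue
```

When the dictionary is an orthonormal basis, each selection removes one component exactly, so a signal supported on $M$ basis vectors is recovered in $M$ steps; for redundant dictionaries the residue only decreases, without such a finite-step guarantee.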

Basis pursuit

Approximating $f$ with a minimum number of nonzero coefficients $a[p]$ in a dictionary $\mathcal{D}$ is equivalent to minimizing the $\ell^0$ norm $\|a\|_0$, which gives the number of nonzero coefficients. This $\ell^0$ norm is highly nonconvex, which explains why the resulting minimization is NP-hard. Donoho and Chen (DonohoC:95) thus proposed replacing the $\ell^0$ norm by the $\ell^1$ norm $\|a\|_1 = \sum_{p \in \Gamma} |a[p]|$, which is convex. The resulting basis pursuit algorithm computes a synthesis

$$f = \sum_{p \in \Gamma} a[p] \, \varphi_p, \quad \text{which minimizes} \quad \|a\|_1 = \sum_{p \in \Gamma} |a[p]|.$$

This optimal solution is calculated with a linear programming algorithm. A basis pursuit is computationally more intense than a matching pursuit, but it is a more global optimization that yields representations that can be more sparse.
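The standard linear-programming formulation splits $a = u - v$ with $u, v \ge 0$, so that $\|a\|_1 = \sum (u + v)$ at the optimum. A minimal sketch, assuming `scipy` is available and that the dictionary is given as a matrix `D` of column vectors (names are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(f, D):
    """Basis pursuit as a linear program: min ||a||_1 subject to D a = f.

    Split a = u - v with u, v >= 0; then ||a||_1 = sum(u) + sum(v)
    at the optimum, which gives a standard-form LP.
    """
    n, P = D.shape
    c = np.ones(2 * P)              # objective: sum(u) + sum(v)
    A_eq = np.hstack([D, -D])       # constraint: D u - D v = f
    res = linprog(c, A_eq=A_eq, b_eq=f, bounds=(0, None))
    u, v = res.x[:P], res.x[P:]
    return u - v
```

For example, with the redundant dictionary $\{e_1, e_2, (e_1 + e_2)/\sqrt{2}\}$ in $\mathbb{R}^2$ and $f = e_1 + e_2$, the $\ell^1$ minimizer uses only the third atom, with coefficient $\sqrt{2}$, rather than the two-atom representation of $\ell^1$ norm 2.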

In approximation, compression, or denoising applications, $f$ is recovered with an error bounded by a precision parameter $\epsilon$. The exact synthesis above is thus relaxed by finding a synthesis such that

$$\Big\| f - \sum_{p \in \Gamma} a[p] \, \varphi_p \Big\| \le \epsilon, \quad \text{which minimizes} \quad \|a\|_1 = \sum_{p \in \Gamma} |a[p]|.$$

This is a convex minimization problem, whose solution is calculated by minimizing the corresponding $\ell^1$ Lagrangian

$$L_1(T, f, a) = \Big\| f - \sum_{p \in \Gamma} a[p] \, \varphi_p \Big\|^2 + T \, \|a\|_1,$$

where $T$ is a Lagrange multiplier that depends on $\epsilon$. This is called an $\ell^1$ Lagrangian pursuit in this book. A solution $\tilde a[p]$ is computed with iterative algorithms that are guaranteed to converge. The number of nonzero coordinates of $\tilde a$ typically decreases as $T$ increases.
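The text does not specify which iterative algorithm is used; one standard convergent choice for this Lagrangian is iterative soft thresholding (ISTA), which alternates a gradient step on the quadratic term with the proximal map of the $\ell^1$ penalty. A minimal sketch (the dictionary matrix `D` and the step-size rule are assumptions of this illustration):

```python
import numpy as np

def ista(f, D, T, n_iter=500):
    """Iterative soft thresholding for ||f - D a||^2 + T ||a||_1.

    D is (n, P) with dictionary vectors as columns. The step size mu
    must satisfy mu < 1 / ||D||^2 (largest squared singular value)
    for convergence.
    """
    mu = 0.9 / np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the data-fidelity term ||f - D a||^2 ...
        a = a + mu * (D.T @ (f - D @ a))
        # ... followed by soft thresholding, the proximal map of T ||a||_1.
        a = np.sign(a) * np.maximum(np.abs(a) - mu * T / 2, 0.0)
    return a
```

When $\mathcal{D}$ is an orthonormal basis this converges to the closed-form solution $\tilde a[p] = \operatorname{soft}(\langle f, \varphi_p \rangle, T/2)$, and one sees directly that raising $T$ zeroes out more coordinates.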

Incoherence for support recovery

Matching pursuits and $\ell^1$ Lagrangian pursuits are optimal if they recover the approximation support $\lambda_T$, which minimizes the approximation Lagrangian

$$L_0(T, f, \lambda) = \| f - f_\lambda \|^2 + T^2 \, |\lambda|,$$

where $f_\lambda$ is the orthogonal projection of $f$ on the space $V_\lambda$ generated by $\{\varphi_p\}_{p \in \lambda}$. This is not always true and depends on $\lambda_T$. An exact recovery criterion proved by Tropp (tropp-multi-omp) guarantees that pursuit algorithms do recover the optimal support $\lambda_T$ if

$$ERC(\lambda_T) = \max_{q \notin \lambda_T} \sum_{p \in \lambda_T} |\langle \tilde\varphi_p, \varphi_q \rangle| < 1,$$

where $\{\tilde\varphi_p\}_{p \in \lambda_T}$ is the biorthogonal basis of $\{\varphi_p\}_{p \in \lambda_T}$ in $V_{\lambda_T}$. This criterion implies that dictionary vectors $\varphi_q$ outside $\lambda_T$ should have small inner products with vectors in $\lambda_T$.

This recovery is stable relative to noise perturbations if $\{\varphi_p\}_{p \in \lambda_T}$ has Riesz bounds that are not too far from 1. These vectors should then be nearly orthogonal and hence have small inner products. These small inner-product conditions are interpreted as a form of incoherence. A stable recovery of $\lambda_T$ is possible if vectors in $\lambda_T$ are incoherent with respect to other dictionary vectors and incoherent between themselves. It depends on the geometric configuration of $\lambda_T$ in $\Gamma$.
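For a finite dictionary, the exact recovery criterion can be evaluated numerically: the biorthogonal (dual) vectors $\tilde\varphi_p$ of the support atoms are the rows of the pseudo-inverse of the support submatrix. A sketch under that assumption, with a dictionary matrix `D` of column vectors (names are illustrative):

```python
import numpy as np

def erc(D, support):
    """Tropp's exact recovery coefficient for a candidate support.

    D is (n, P) with dictionary vectors phi_p as columns. Returns
    max over q outside the support of sum_p |<phi~_p, phi_q>|, where
    the dual vectors phi~_p are the rows of the pseudo-inverse of the
    support submatrix. Recovery is guaranteed when the value is < 1.
    """
    support = list(support)
    D_s = D[:, support]                  # atoms inside the support
    pinv = np.linalg.pinv(D_s)           # rows are the duals phi~_p
    outside = [q for q in range(D.shape[1]) if q not in support]
    return max(np.sum(np.abs(pinv @ D[:, q])) for q in outside)
```

For an orthonormal dictionary the coefficient is 0 for every support, while for the redundant dictionary $\{e_1, e_2, (e_1 + e_2)/\sqrt{2}\}$ and support $\{(e_1 + e_2)/\sqrt{2}\}$ it equals $1/\sqrt{2} < 1$, so recovery of that support is still guaranteed.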





Source:  OpenStax, A wavelet tour of signal processing, the sparse way. OpenStax CNX. Sep 14, 2009 Download for free at http://cnx.org/content/col10711/1.3
