Copyright © by SIAM. Unauthorized reproduction of this article is prohibited.
36 ALFRED M. BRUCKSTEIN, DAVID L. DONOHO, AND MICHAEL ELAD
Beyond conceptual issues of uniqueness and verification of solutions, one is easily
overwhelmed by the apparent difficulty of solving (P_0). This is a classical problem
of combinatorial search; one sweeps exhaustively through all possible sparse subsets,
generating corresponding subsystems b = A_S x_S, where A_S denotes the matrix with
|S| columns chosen from those columns of A with indices in S, and checking whether
b = A_S x_S can be solved. The complexity of exhaustive search is exponential in
m and, indeed, it has been proven that (P_0) is, in general, NP-hard [125]. Thus, a
mandatory and crucial set of questions arises: Can (P_0) be efficiently solved by some
other means? Can approximate solutions be accepted? How accurate can those be?
What kind of approximations will work?
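The combinatorial search just described can be made concrete in a short sketch (not from the original text; the function name and tolerance are illustrative). It enumerates supports S of growing size, solves each subsystem b = A_S x_S by least squares, and returns the first exact fit, which by construction has minimal support size. The exponential cost in the number of columns is exactly why this approach is hopeless at scale.

```python
import itertools
import numpy as np

def exhaustive_p0(A, b, tol=1e-10):
    """Brute-force search for the sparsest x satisfying A x = b.

    Illustrative only: the outer loops range over all supports S,
    so the cost grows exponentially with the number of columns m.
    """
    n, m = A.shape
    for k in range(1, m + 1):                    # supports of growing size |S| = k
        for S in itertools.combinations(range(m), k):
            A_S = A[:, S]                        # subsystem with columns indexed by S
            x_S, _, _, _ = np.linalg.lstsq(A_S, b, rcond=None)
            if np.linalg.norm(A_S @ x_S - b) < tol:
                x = np.zeros(m)
                x[list(S)] = x_S                 # embed the sparse solution
                return x                         # first hit has minimal |S|
    return None
```

For a small system such as A = [[1, 0, 1], [0, 1, 1]], b = [1, 1], the search finds the one-term solution x = [0, 0, 1] after testing only the three single-column supports.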
In fact, why should anyone believe that any progress of any kind is possible here?
Here is a hint. Let ||x||_1 denote the ℓ1 norm Σ_i |x_i|, and consider the problem (P_1)
obtained by setting J(x) = J_1(x) = ||x||_1. This problem is somehow intermediate
between (P_2) and (P_0). It is a convex optimization problem, and among convex
problems it is in some sense the one closest to (P_0). We will see below [49, 93, 46]
that for matrices A with incoherent columns, whenever (P_0) has a sufficiently sparse
solution, that solution is unique and is equal to the solution of (P_1). Since (P_1) is
convex, the solution can thus be obtained by standard optimization tools—in fact,
linear programming. Even more surprisingly, for the same class A, some very simple
greedy algorithms (GAs) can also find the sparsest solution to (P_0) [156].
Today many pure and applied mathematicians are pursuing results concerning
sparse solutions to underdetermined systems of linear equations. The results achieved
so far range from identifying conditions under which (P_0) has a unique solution, to
conditions under which (P_0) has the same solution as (P_1), to conditions under which
the solution can be found by some “pursuit” algorithm. Extensions range even more
widely, from less restrictive notions of sparsity to the impact of noise, the behavior of
approximate solutions, and the properties of problem instances defined by ensembles
of random matrices. We hope to introduce the reader to some of this work below and
provide some appropriate pointers to the growing literature.
1.2. The Signal Processing Perspective. We now know that finding sparse so-
lutions to underdetermined linear systems is a better-behaved and much more prac-
tically relevant notion than we might have supposed just a few years ago.
In parallel with this development, another insight has been developing in signal
and image processing, where it has been found that many media types (still imagery,
video, acoustic) can be sparsely represented using transform-domain methods, and in
fact many important tasks dealing with such media can be fruitfully viewed as finding
sparse solutions to underdetermined systems of linear equations.
Many readers will be familiar with the media encoding standard JPEG and its
successor, JPEG-2000 [153]. Both standards are based on the notion of transform
encoding. The data vector representing the raw pixel samples is transformed—
i.e., represented in a new coordinate system—and the resulting coordinates are then
processed to produce the encoded bitstream. JPEG relies on the discrete cosine
transform (DCT)—a variant of the Fourier transform—while JPEG-2000 relies on the
discrete wavelet transform (DWT) [116]. These transforms can be viewed analytically
as rotations of coordinate axes from the standard Euclidean basis to a new basis. Why
does it make sense to change coordinates in this way? Sparsity provides the answer.
The DCT of media content often has the property that the first several transform
coefficients are quite large and later ones are very small. Treating the later coefficients
as zeros and approximating the early ones by quantized representations yields an
approximation that can be encoded with relatively few bits.
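A one-dimensional analogue of this transform-coding idea can be sketched in a few lines (an illustration of the principle, not the JPEG pipeline itself; the signal and the number of retained coefficients are arbitrary choices). A smooth signal is mapped through an orthonormal DCT, all but the largest coefficients are set to zero, and the inverse transform shows how little is lost:

```python
import numpy as np
from scipy.fft import dct, idct

# A smooth test signal: a sum of two low-frequency cosines.
t = np.linspace(0.0, 1.0, 256)
signal = np.cos(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

coeffs = dct(signal, norm='ortho')           # orthonormal DCT-II
k = 16                                       # keep only the 16 largest coefficients
small = np.argsort(np.abs(coeffs))[:-k]
coeffs_sparse = coeffs.copy()
coeffs_sparse[small] = 0.0                   # treat the small coefficients as zeros

approx = idct(coeffs_sparse, norm='ortho')   # reconstruct from the sparse coefficients
rel_err = np.linalg.norm(approx - signal) / np.linalg.norm(signal)
```

Because the DCT concentrates the energy of smooth signals in a few coefficients, the relative reconstruction error stays small even though 94% of the coefficients were discarded—precisely the sparsity phenomenon that transform encoding exploits.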