Abstract

Suppose we wish to recover a vector x0 ∈ ℝ^𝓂 (e.g., a digital signal or image) from incomplete and contaminated observations y = A x0 + e; A is an 𝓃 × 𝓂 matrix with far fewer rows than columns (𝓃 ≪ 𝓂) and e is an error term. Is it possible to recover x0 accurately based on the data y? To recover x0, we consider the solution x# to the 𝓁1‐regularization problem min ‖x‖𝓁1 subject to ‖Ax − y‖𝓁2 ≤ ϵ, where ϵ is the size of the error term e. We show that if A obeys a uniform uncertainty principle (with unit‐normed columns) and if the vector x0 is sufficiently sparse, then the solution is within the noise level: ‖x# − x0‖𝓁2 ≤ C·ϵ. As a first example, suppose that A is a Gaussian random matrix; then stable recovery occurs for almost all such A's provided that the number of nonzeros of x0 is of about the same order as the number of observations. As a second instance, suppose one observes few Fourier samples of x0; then stable recovery occurs for almost any set of 𝓃 coefficients provided that the number of nonzeros is of the order of 𝓃/(log 𝓂)^6. In the case where the error term vanishes, the recovery is of course exact, and this work actually provides novel insights into the exact recovery phenomenon discussed in earlier papers. The methodology also explains why one can very nearly recover approximately sparse signals. © 2006 Wiley Periodicals, Inc.

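To make the recovery program concrete, the following is a minimal numerical sketch of the 𝓁1‐regularization problem described above, with a Gaussian measurement matrix as in the first example. The dimensions, sparsity level, noise size, choice of ϵ, and the use of the cvxpy solver are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n, k = 512, 128, 10    # signal length m, number of observations n (n << m), sparsity k (assumed values)
sigma = 0.01              # per-measurement noise level (assumed)

# Sparse signal x0 with k nonzero entries
x0 = np.zeros(m)
support = rng.choice(m, k, replace=False)
x0[support] = rng.standard_normal(k)

# Gaussian measurement matrix with approximately unit-normed columns
A = rng.standard_normal((n, m)) / np.sqrt(n)

# Incomplete and contaminated observations y = A x0 + e
e = sigma * rng.standard_normal(n)
y = A @ x0 + e
eps = sigma * np.sqrt(n)  # rough bound on the size of the error term e (assumed choice)

# l1-regularization: minimize ||x||_1 subject to ||A x - y||_2 <= eps
x = cp.Variable(m)
problem = cp.Problem(cp.Minimize(cp.norm1(x)),
                     [cp.norm2(A @ x - y) <= eps])
problem.solve()

# The recovered solution x# should be within a constant multiple of the noise level
print("recovery error ||x# - x0||_2:", np.linalg.norm(x.value - x0))
print("noise level eps:", eps)
```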
