Abstract

This paper investigates the problem of signal estimation from undersampled noisy sub-Gaussian measurements under the assumption of a cosparse model. Based on generalized notions of sparsity, we derive novel recovery guarantees for the $\ell^1$-analysis basis pursuit, enabling accurate predictions of its sample complexity. The corresponding bounds on the number of required measurements explicitly depend on the Gram matrix of the analysis operator and therefore account in particular for its mutual coherence structure. Our findings defy conventional wisdom, which promotes the sparsity of analysis coefficients as the crucial quantity to study. In fact, this common paradigm breaks down completely in many situations of practical interest, for instance, when a redundant (multilevel) frame is applied as analysis prior. Extensive numerical experiments demonstrate that, in contrast, our theoretical sampling-rate bounds reliably capture the recovery capability of various examples, such as redundant wavelet systems, total variation, or random frames. The proofs of our main results build upon recent achievements in the convex geometry of linear inverse problems. More precisely, we establish a sophisticated upper bound on the conic Gaussian mean width associated with the underlying $\ell^1$-analysis polytope. Thanks to a novel localization argument, the presented framework naturally extends to stable recovery, allowing us to incorporate compressible coefficient sequences as well.
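
For orientation, the $\ell^1$-analysis basis pursuit referred to above is typically stated as the following convex program; the specific symbols used here ($\mathbf{\Psi}$ for the analysis operator, $\mathbf{A}$ for the sub-Gaussian measurement matrix, $\mathbf{y}$ for the noisy measurements, and $\eta$ for the noise level) are our notational assumptions rather than taken from the abstract:

$$\min_{\mathbf{x} \in \mathbb{R}^d} \ \|\mathbf{\Psi}\mathbf{x}\|_1 \quad \text{subject to} \quad \|\mathbf{A}\mathbf{x} - \mathbf{y}\|_2 \le \eta.$$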
