Abstract

Compressed sensing is the art of reconstructing structured $n$-dimensional vectors from substantially fewer measurements than naively anticipated. A plethora of analytic reconstruction guarantees support this credo. The strongest among them are based on deep results from large-dimensional probability theory that require a considerable amount of randomness in the measurement design. Here, we demonstrate that derandomization techniques allow for a considerable reduction in the amount of randomness required by such proof strategies. More precisely, we establish uniform $s$-sparse reconstruction guarantees for $C s \log (n)$ measurements that are chosen independently from strength-four orthogonal arrays and maximal sets of mutually unbiased bases, respectively. These are highly structured families of $\tilde{C} n^2$ vectors that imitate signed Bernoulli and standard Gaussian vectors in a (partially) derandomized fashion.
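To make the notion of mutual unbiasedness concrete, the following sketch (not taken from the paper) builds the well-known maximal set of $p + 1$ mutually unbiased bases in dimension $p$ for an odd prime $p$, where basis $B_a$ has vectors with entries $p^{-1/2}\,\omega^{a l^2 + b l}$, $\omega = e^{2\pi i/p}$, alongside the computational basis. The check verifies the defining property: any two vectors from different bases have squared overlap exactly $1/p$.

```python
import numpy as np

def mub_prime(p):
    """Maximal set of p + 1 mutually unbiased bases in C^p for an odd prime p.

    Standard construction: the computational basis plus bases B_a
    (a = 0, ..., p-1) whose column b has entries
        p^{-1/2} * omega^(a*l^2 + b*l),   omega = exp(2*pi*i/p),  l = 0, ..., p-1.
    Each basis is returned as a p x p matrix whose columns are the basis vectors.
    """
    omega = np.exp(2j * np.pi / p)
    bases = [np.eye(p, dtype=complex)]  # computational basis
    for a in range(p):
        B = np.array([[omega ** (a * l * l + b * l) for b in range(p)]
                      for l in range(p)]) / np.sqrt(p)
        bases.append(B)
    return bases

p = 5
bases = mub_prime(p)
# Pairwise unbiasedness: |<u, v>|^2 == 1/p for vectors from *different* bases.
for i in range(len(bases)):
    for j in range(i + 1, len(bases)):
        overlaps = np.abs(bases[i].conj().T @ bases[j]) ** 2
        assert np.allclose(overlaps, 1.0 / p)
```

The magnitude-$\sqrt{p}$ quadratic Gauss sums are what make distinct bases $B_a$, $B_{a'}$ unbiased; this construction fails for $p = 2$, which needs a separate treatment.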

Highlights

  • One can show that at most $n + 1$ different orthonormal bases can exist that have this property in a pairwise fashion [21, Theorem 3.5]. Such a set of $n + 1$ bases is called a maximal set of mutually unbiased bases (MMUB).

  • With probability at least $1 - 2e^{-cm}$, any $s$-sparse $x \in \mathbb{R}^n$ can be recovered from $y = Ax$ by solving (1). This result readily generalizes to measurements that are sampled from a maximal set of mutually unbiased bases.

  • The nullspace property, as well as its connection to uniform $s$-sparse recovery, readily generalizes to complex-valued $s$-sparse vectors.
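The recovery program (1) referenced above is not reproduced on this page; assuming it is standard basis pursuit, $\min \|x\|_1$ subject to $Ax = y$, the highlight can be illustrated with the usual linear-programming reformulation ($x = u - v$ with $u, v \ge 0$). The signed-Bernoulli measurement matrix and the dimensions below are illustrative choices, not the paper's structured designs.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. A x = y via the standard LP reformulation.

    Split x = u - v with u, v >= 0; then ||x||_1 = 1^T (u + v) and the
    equality constraint becomes [A, -A] [u; v] = y.
    """
    m, n = A.shape
    c = np.ones(2 * n)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(0)
n, m, s = 128, 40, 4                                    # ambient dim, measurements, sparsity
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # signed Bernoulli rows
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
y = A @ x
x_hat = basis_pursuit(A, y)
# With m on the order of s log(n), exact recovery is expected with
# overwhelming probability, so ||x_hat - x|| should be negligibly small.
```

Here $m = 40 \approx C s \log(n)$ measurements suffice, in line with the stated sampling-rate regime.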

Summary

Motivation

Compressed sensing is the art of effectively reconstructing structured signals from substantially fewer measurements than would naively be required by standard techniques such as least squares. Our results highlight that derandomization almost allows bridging the gap between existing proof techniques for generic and structured measurements: the resulting guarantees are still strong, but require slightly more randomness than choosing vectors uniformly from a bounded orthogonal system, such as Fourier or Hadamard vectors. While we are not able to fully overcome this drawback here, the methods described in this work do limit the amount of randomness required to generate individual structured measurements. We believe that this may help to reduce the discrepancy between “what can be proved” and “what can be done” in a variety of concrete applications.

Preliminaries on Compressed Sensing
Partially Derandomizing Signed Bernoulli Vectors
Partially Derandomizing Complex Standard Gaussian Vectors
Main Results
Proofs
Extending the Scope to Subgaussian Measurements
Generalization to Complex-Valued Signals and Partial Derandomization
Recovery Guarantee for Strength-4 Orthogonal Arrays
Recovery Guarantee for Mutually Unbiased Bases
Extension to Noisy Measurements
Numerical Experiments
Proof of Theorem 7
Proof of Lemma 2
