Abstract

The Multiple Measurement Vector (MMV) problem is central to sparse signal processing, where the goal is to recover the common support of a set of unknown sparse vectors of size $N$ from $L$ compressed measurement vectors, each of size $M$. Recent advances in correlation-aware and Bayesian techniques for MMV models show promising evidence that, under appropriate assumptions, it is possible to recover supports of size ($s$) larger than the dimension ($M$) of each measurement vector. However, these results are primarily asymptotic in $L$ and cannot provide support recovery guarantees for finite $L$. This paper overcomes this drawback by focusing on a broader family of correlation-aware optimization problems (which includes convex problems) and establishing rigorous non-asymptotic probabilistic guarantees in the regime $s > M$ when the measurements are collected using appropriately designed sparse sensor arrays (such as nested and coprime arrays). Assuming the source locations obey a certain "minimum separation" condition, we develop uniform upper bounds on the estimation error that are obeyed by any algorithm belonging to this family, and utilize these bounds to ensure probabilistic support recovery in the regime $s > M$. Our results crucially rely upon (i) the unique geometry of the difference co-array of the sparse arrays, and (ii) certain non-negativity constraints on the optimization variable. As a result of independent interest, we also show that this upper bound is tight with respect to the dimension $N$. Extensive numerical simulations (including phase transition plots) are presented to validate the theoretical claims.
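To make the co-array geometry concrete, the sketch below (not part of the paper; it uses the standard two-level nested-array construction) computes the difference co-array of a nested array with $M = 6$ physical sensors. The co-array forms a contiguous set of lags far larger than $M$, which is the structural property that makes identifying $s > M$ sources possible.

```python
import numpy as np

def nested_array(n1, n2):
    # Sensor positions of a standard two-level nested array on an integer grid:
    # a dense inner ULA {1, ..., n1} plus a sparse outer ULA {k*(n1+1) : k = 1..n2}.
    inner = np.arange(1, n1 + 1)
    outer = (n1 + 1) * np.arange(1, n2 + 1)
    return np.concatenate([inner, outer])

def difference_coarray(positions):
    # The difference co-array: the set of all pairwise differences p_i - p_j.
    diffs = positions[:, None] - positions[None, :]
    return np.unique(diffs)

pos = nested_array(3, 3)            # M = 6 physical sensors
lags = difference_coarray(pos)

print(pos)                          # [ 1  2  3  4  8 12]
print(lags.min(), lags.max(), len(lags))   # -11 11 23
```

Here the co-array is the contiguous lag set $\{-11, \dots, 11\}$, i.e. $2 n_2 (n_1 + 1) - 1 = 23$ virtual sensors from only 6 physical ones; correlation-aware methods operate on this virtual array rather than on the physical one.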
