Abstract

In this letter, we address the problem of reconstructing the common nonzero support of multiple jointly sparse vectors from their noisy, underdetermined linear measurements. The support recovery problem is formulated as the selection of the nonnegative hyperparameters of a correlation-aware, joint-sparsity-inducing Gaussian prior. The hyperparameters are recovered as a nonnegative sparse solution of covariance-matching constraints formulated in the observation space, obtained by solving a sequence of proximally regularized convex optimization problems. For proximal regularization based on the von Neumann Bregman matrix divergence, an exponentiated gradient (EG) update is proposed which, when applied iteratively, converges to hyperparameters with the correct sparse support. Compared to existing multiple measurement vector support recovery algorithms, the proposed multiplicative EG update has significantly lower computational and storage complexity and takes fewer iterations to converge. We empirically demonstrate that the support-recovery algorithm based on the proposed EG update can solve million-variable support recovery problems in tens of seconds. Additionally, by leveraging its correlation-awareness property, the proposed algorithm can recover supports of size as large as $O(m^2)$ from only $m$ linear measurements per jointly sparse vector.
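To make the covariance-matching idea concrete, the following is a minimal illustrative sketch, not the authors' exact algorithm: the measurement covariance is modeled as $A\,\mathrm{diag}(\gamma)\,A^\top + \sigma^2 I$, a Frobenius-norm mismatch to the sample covariance is minimized over $\gamma \ge 0$, and the nonnegativity is maintained automatically by a multiplicative EG step $\gamma \leftarrow \gamma \odot e^{-\eta \nabla f(\gamma)}$. All problem sizes, the backtracking rule, and the step-size choice below are our own assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (our choices, not from the paper):
# m measurements, n-dimensional sparse vectors, support size k, L snapshots.
m, n, k, L = 10, 25, 2, 500
sigma2 = 1e-3                      # noise variance, assumed known here

# Random sensing matrix with unit-norm columns.
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)

# Jointly sparse sources: a common support, fresh amplitudes per snapshot.
support = np.array([3, 17])
X = np.zeros((n, L))
X[support] = rng.standard_normal((k, L))
Y = A @ X + np.sqrt(sigma2) * rng.standard_normal((m, L))

S = (Y @ Y.T) / L                  # sample covariance of the measurements

def objective(gamma):
    # Covariance-matching mismatch: ||A diag(gamma) A^T + sigma^2 I - S||_F^2,
    # a convex quadratic in gamma because R(gamma) is linear in gamma.
    R = (A * gamma) @ A.T + sigma2 * np.eye(m)
    return np.sum((R - S) ** 2)

gamma = np.full(n, 0.1)            # strictly positive initialization
f = objective(gamma)
for _ in range(500):
    R = (A * gamma) @ A.T + sigma2 * np.eye(m)
    # grad_i = 2 * a_i^T (R - S) a_i for each column a_i of A.
    grad = 2.0 * np.einsum('ij,jk,ki->i', A.T, R - S, A)
    eta = 0.5 / (np.abs(grad).max() + 1e-12)
    # Multiplicative EG step with backtracking; gamma stays positive by design.
    while True:
        trial = gamma * np.exp(-eta * grad)
        f_trial = objective(trial)
        if f_trial <= f:
            gamma, f = trial, f_trial
            break
        if eta < 1e-12:
            break
        eta *= 0.5

recovered = np.sort(np.argsort(gamma)[-k:])
print("recovered support:", recovered)   # indices of the largest hyperparameters
```

Because the update is purely multiplicative, each iteration costs only matrix-vector products and never needs a projection onto the nonnegative orthant, which is the source of the low per-iteration complexity highlighted in the abstract.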
