Abstract

Recovering structured models (e.g., sparse or group-sparse vectors, low-rank matrices) from a few linear observations has been well studied recently. In various applications in signal processing and machine learning, the model of interest is structured in several ways simultaneously, for example, a matrix that is both sparse and low rank. Often, norms that promote each individual structure are known and allow recovery using an order-wise optimal number of measurements (e.g., the $\ell_1$ norm for sparsity, the nuclear norm for matrix rank), so it is reasonable to minimize a combination of such norms. We show that, surprisingly, multiobjective optimization with these norms can do no better, order-wise, than exploiting only one of the structures, thus revealing a fundamental limitation in sample complexity. This result suggests that fully exploiting the multiple structures requires an entirely new convex relaxation. Further, specializing our results to the case of sparse and low-rank matrices, we show that a nonconvex formulation recovers the model from very few measurements (on the order of the degrees of freedom), whereas the convex program combining the $\ell_1$ and nuclear norms requires many more measurements, illustrating a gap between the performance of the convex and nonconvex recovery problems. Our framework applies to arbitrary structure-inducing norms as well as to a wide range of measurement ensembles. This allows us to give sample complexity bounds for problems such as sparse phase retrieval and low-rank tensor completion.
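
To make the combined-norm convex program concrete, here is a minimal sketch (not the paper's code) of recovering a simultaneously sparse and low-rank matrix by minimizing $\|X\|_1 + \lambda \|X\|_*$ subject to random Gaussian linear measurements, using cvxpy. The problem sizes, the weight lam, and the Gaussian measurement ensemble are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r, k, m = 20, 2, 5, 150  # matrix size, rank, sparsity, number of measurements

# Ground truth X0: rank r, supported on a k x k block, so it is
# simultaneously sparse and low rank.
U = np.zeros((n, r))
U[:k, :] = rng.standard_normal((k, r))
X0 = U @ U.T

# Random Gaussian linear measurements y_i = <A_i, X0>.
A = rng.standard_normal((m, n, n))
y = np.array([np.sum(A[i] * X0) for i in range(m)])

# Convex recovery: minimize the entrywise l1 norm plus a weighted
# nuclear norm, subject to consistency with the measurements.
lam = 1.0  # illustrative weight between the two norms
X = cp.Variable((n, n))
constraints = [cp.sum(cp.multiply(A[i], X)) == y[i] for i in range(m)]
prob = cp.Problem(cp.Minimize(cp.sum(cp.abs(X)) + lam * cp.normNuc(X)),
                  constraints)
prob.solve()

print("relative recovery error:",
      np.linalg.norm(X.value - X0) / np.linalg.norm(X0))
```

Per the result stated above, no choice of the weight lam improves the order-wise sample complexity of this program beyond what the better of the two norms achieves on its own.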
