Abstract

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, and system identification and control. This class of optimization problems is generally 𝒩𝒫-hard. A popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we extend and characterize the concept of s-goodness for a sensing matrix in sparse signal recovery (proposed by Juditsky and Nemirovski (Math Program, 2011)) to linear transformations in LMR. Using the two characteristic s-goodness constants, γ_s and γ̂_s, of a linear transformation, we derive necessary and sufficient conditions for a linear transformation to be s-good. Moreover, we establish the equivalence of s-goodness and the null space properties. Therefore, s-goodness is a necessary and sufficient condition for exact s-rank matrix recovery via nuclear norm minimization.
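The nuclear norm ‖X‖_* is the sum of the singular values of X, and it serves as the convex surrogate for rank(X) in the relaxation described above. A minimal numerical sketch of the two quantities, using NumPy (the example itself is illustrative and not part of the paper):

```python
import numpy as np

# Build a rank-2 matrix X in R^{5x4} as a product of thin factors.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

# Singular values of X: rank(X) counts the nonzero ones,
# while the nuclear norm ||X||_* sums them all.
sigma = np.linalg.svd(X, compute_uv=False)
rank_X = int(np.sum(sigma > 1e-10))
nuclear_norm = float(np.sum(sigma))

print(rank_X)  # prints 2: only two singular values are nonzero
```

Unlike the rank, the nuclear norm is a convex function of X, which is what makes the relaxed problem tractable.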

Highlights

  • Low-rank matrix recovery (LMR for short) is a rank minimization problem (RMP) with linear constraints, also known as the affine matrix rank minimization problem: minimize rank(X) subject to A(X) = b, (1) where X ∈ R^{m×n} is the matrix variable, A : R^{m×n} → R^p is a linear transformation, and b ∈ R^p

  • In order to characterize the s-goodness of a linear transformation A, we study the basic properties of G-numbers

  • The above theorem says that s-goodness is a necessary and sufficient condition for recovering the low-rank solution exactly via nuclear norm minimization


Summary

Introduction

Low-rank matrix recovery (LMR for short) is a rank minimization problem (RMP) with linear constraints, also known as the affine matrix rank minimization problem: minimize rank(X) subject to A(X) = b. (1) When m = n and the matrix X := Diag(x), x ∈ R^n, is diagonal, the LMR (1) reduces to sparse signal recovery (SSR), which is the so-called cardinality minimization problem (CMP): min ‖x‖_0 subject to Ax = b. (3) Juditsky and Nemirovski [24] established necessary and sufficient conditions for a sensing matrix to be "s-good", i.e., to allow exact ℓ1-recovery of sparse signals with s nonzero entries when no measurement noise is present. Although these characteristics are difficult to evaluate, they lead to verifiable sufficient conditions for exact SSR and to efficiently computable upper bounds on those s for which a given sensing matrix is s-good. For a linear transformation A : R^{m×n} → R^p, we denote by A* : R^p → R^{m×n} the adjoint of A
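The diagonal reduction above can be checked numerically: for X = Diag(x), the singular values of X are |x_i|, so rank(X) = ‖x‖_0 and ‖X‖_* = ‖x‖_1, and nuclear norm minimization specializes to ℓ1 minimization. A small NumPy sketch (illustrative only, with an arbitrary sparse vector):

```python
import numpy as np

x = np.array([3.0, 0.0, -1.5, 0.0, 2.0])  # a sparse vector with ||x||_0 = 3
X = np.diag(x)                             # X = Diag(x)

sigma = np.linalg.svd(X, compute_uv=False)
rank_X = int(np.sum(sigma > 1e-10))        # rank(X)
nuc = float(np.sum(sigma))                 # ||X||_* = sum of singular values

l0 = int(np.count_nonzero(x))              # ||x||_0
l1 = float(np.sum(np.abs(x)))              # ||x||_1

print(rank_X == l0, abs(nuc - l1) < 1e-12)  # prints: True True
```

This is exactly the sense in which the s-goodness theory for sensing matrices in SSR is a special case of the theory for linear transformations in LMR.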

Definitions and Basic Properties
Conclusion
