Abstract

The problem of affine rank minimization seeks the minimum-rank matrix that satisfies a set of linear equality constraints. Since affine rank minimization is NP-hard in general, a popular heuristic is to minimize the nuclear norm, the sum of the singular values of the matrix variable. A recent intriguing paper shows that if the linear transform defining the set of equality constraints is nearly isometrically distributed and the number of constraints is at least O(r(m + n) log mn), where r is the rank and m × n the size of the minimum-rank matrix, then minimizing the nuclear norm yields exactly the minimum-rank solution. Unfortunately, solving the nuclear norm minimization problem with known nearly isometric transforms demands substantial computation and memory. This paper presents a fast and efficient algorithm for nuclear norm minimization that employs structurally random matrices as its linear transform, together with a projected subgradient method that exploits the unique features of structurally random matrices to substantially speed up the optimization process. Theoretically, we show that nuclear norm minimization with structurally random linear constraints guarantees the minimum-rank solution if the number of linear constraints is at least O(r(m + n) log³ mn). Extensive simulations verify that structurally random transforms retain optimal performance while their implementation complexity is only a fraction of that of completely random transforms, making them promising candidates for large-scale applications.
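To make the described approach concrete, the following is a minimal sketch, not the authors' implementation, of projected-subgradient nuclear norm minimization under a structurally random sensing operator built from random sign flips, a fast orthonormal DCT, and random row subsampling. Because such an operator has orthonormal rows, projection onto the affine constraint set reduces to one forward and one adjoint fast transform, which is the kind of structure the paper exploits. All sizes, step sizes, and helper names (A, At) are illustrative assumptions.

```python
# Minimal sketch: projected subgradient for  min ||X||_*  s.t.  A vec(X) = b,
# with A a structurally random operator (sign flip -> orthonormal DCT -> subsample).
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
m, n, r, p = 40, 40, 3, 1200            # matrix size, rank, number of measurements (assumed)
N = m * n

signs = rng.choice([-1.0, 1.0], size=N)          # random pre-transform sign flipping
rows = rng.choice(N, size=p, replace=False)      # random subsampling of transform coefficients

def A(x):
    # Forward operator: length-N vector -> p measurements, applied via a fast transform.
    return dct(signs * x, norm="ortho")[rows]

def At(y):
    # Adjoint operator: p measurements -> length-N vector.
    z = np.zeros(N)
    z[rows] = y
    return signs * idct(z, norm="ortho")

# Synthetic rank-r ground truth and its measurements.
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = A(X_true.ravel())

# Projected subgradient iterations. Since the rows of A are orthonormal (A A^T = I),
# projecting x onto {x : A x = b} is simply x - A^T (A x - b), which costs two fast transforms.
X = np.zeros((m, n))
for k in range(1, 501):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    G = U @ Vt                            # a subgradient of the nuclear norm at X
    step = 1.0 / np.sqrt(k)               # diminishing step size
    x = (X - step * G).ravel()
    x = x - At(A(x) - b)                  # cheap affine projection
    X = x.reshape(m, n)

print("relative error:", np.linalg.norm(X - X_true) / np.linalg.norm(X_true))
```

A dense Gaussian operator of the same size would require storing and multiplying a p × mn matrix; the structurally random construction above replaces that with O(mn log mn) fast-transform operations per iteration, which is the practical advantage the abstract highlights.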
