Abstract

The problem of low-rank approximation with convex constraints, which appears in data analysis, system identification, model order reduction, low-order controller design, and low-complexity modeling, is considered. Given a matrix, the objective is to find a low-rank approximation that satisfies rank and convex constraints while minimizing the squared Frobenius-norm distance to the given matrix. In many situations, this nonconvex problem is convexified by nuclear-norm regularization. However, we will see that the approximations obtained by this method may be far from optimal. In this paper, we propose an alternative convex relaxation that uses the convex envelope of the squared Frobenius norm and the rank constraint. With this approach, easily verifiable conditions are obtained under which the solutions to the convex relaxation and the original nonconvex problem coincide. A semidefinite programming representation of the convex envelope is derived, which allows us to apply this approach to several known problems. Our example on optimal low-rank Hankel approximation/model reduction illustrates that the proposed convex relaxation performs consistently better than nuclear-norm regularization and may outperform balanced truncation.
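
To fix notation for illustration (the symbols $M$, $r$, $\mathcal{C}$, $\lambda$, and $f_r$ are introduced here and are not taken from the paper), the nonconvex problem, its nuclear-norm convexification, and the proposed relaxation can be sketched as

% notation introduced for illustration only; not taken from the original abstract
\[
\begin{aligned}
\text{(nonconvex)}    \quad & \min_{X \in \mathcal{C}} \; \|M - X\|_F^2 \quad \text{subject to } \operatorname{rank}(X) \le r,\\[2pt]
\text{(nuclear norm)} \quad & \min_{X \in \mathcal{C}} \; \|M - X\|_F^2 + \lambda \|X\|_*,\\[2pt]
\text{(proposed)}     \quad & \min_{X \in \mathcal{C}} \; f_r^{**}(X), \qquad f_r(X) := \|M - X\|_F^2 + \iota_{\{\operatorname{rank} \le r\}}(X),
\end{aligned}
\]

where $\|\cdot\|_*$ denotes the nuclear norm, $\lambda > 0$ is a regularization parameter, $\iota_{\{\operatorname{rank} \le r\}}$ is the indicator function of the rank constraint, and $f_r^{**}$ is the convex envelope (biconjugate) of $f_r$; according to the abstract, this envelope admits a semidefinite programming representation.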
