Abstract

The low-rank approximation problem is to approximate optimally, with respect to some norm, a matrix by one of the same dimension but smaller rank. It is known that under the Frobenius norm, the best low-rank approximation can be found by using the singular value decomposition (SVD). Although this is no longer true under weighted norms in general, it is demonstrated here that the weighted low-rank approximation problem can be solved by finding the subspace that minimizes a particular cost function. A number of advantages of this parameterization over the traditional parameterization are elucidated. Finding the minimizing subspace is equivalent to minimizing a cost function on the Grassmann manifold. A general framework for constructing optimization algorithms on manifolds is presented and it is shown that existing algorithms in the literature are special cases of this framework. Within this framework, two novel algorithms (a steepest descent algorithm and a Newton-like algorithm) are derived for solving the weighted low-rank approximation problem. They are compared with other algorithms for low-rank approximation as well as with other algorithms for minimizing a cost function on a Grassmann manifold.
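To make the two settings in the abstract concrete, the following is a minimal sketch, not the paper's exact algorithm: the Frobenius-norm case solved by a truncated SVD (Eckart–Young), and a plain steepest-descent iteration over the column subspace for a weighted variant. The element-wise weighting convention, the function names (`svd_low_rank`, `weighted_lra_cost`, `grassmann_descent`), and the fixed step size are illustrative assumptions, not the parameterization or algorithms derived in the paper.

```python
import numpy as np

def svd_low_rank(A, r):
    """Best rank-r approximation of A in the Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def weighted_lra_cost(U, A, W):
    """Cost of the best approximation whose columns lie in span(U), for
    element-wise weights W >= 0 (an assumed weighting; the paper treats a
    more general weighted norm).

    For a fixed orthonormal basis U (m x r), the optimal coefficients solve a
    separate weighted least-squares problem for each column of A, so the cost
    depends only on the subspace spanned by U.
    """
    m, n = A.shape
    V = np.empty((n, U.shape[1]))
    for j in range(n):
        Dj = np.diag(W[:, j])
        V[j] = np.linalg.solve(U.T @ Dj @ U, U.T @ Dj @ A[:, j])
    R = A - U @ V.T
    return np.sum(W * R**2), V

def grassmann_descent(A, W, r, step=1e-2, iters=200, seed=0):
    """Generic steepest descent over r-dimensional subspaces (a sketch)."""
    rng = np.random.default_rng(seed)
    U = np.linalg.qr(rng.standard_normal((A.shape[0], r)))[0]
    for _ in range(iters):
        _, V = weighted_lra_cost(U, A, W)
        G = -2.0 * (W * (A - U @ V.T)) @ V   # Euclidean gradient in U (V held optimal)
        G = G - U @ (U.T @ G)                # project onto the tangent space at span(U)
        U = np.linalg.qr(U - step * G)[0]    # retract back to an orthonormal basis
    return U, weighted_lra_cost(U, A, W)[0]
```

With all weights equal, `grassmann_descent` should recover (up to optimization accuracy) the same subspace as `svd_low_rank`; with unequal weights the SVD answer is no longer optimal in general, which is the gap the paper's subspace parameterization and Grassmann-manifold algorithms address.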
