Abstract

This note explores the relations between two different methods. The first is the Alternating Least Squares (ALS) method for computing a rank-k approximation of a real m×n matrix, A. This method has important applications in nonnegative matrix factorizations, in matrix completion problems, and in tensor approximations. The second method is called Orthogonal Iterations; it is also known as Subspace Iterations, Simultaneous Iterations, or the block-Power method. Given a real symmetric matrix, G, this method computes k dominant eigenvectors of G. To see the relation between the two methods we assume that G = AᵀA. It is shown that in this case the two methods generate the same sequence of subspaces and the same sequence of low-rank approximations. This equivalence provides new insight into the convergence properties of both methods.
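
As a concrete illustration of the stated equivalence, the following minimal sketch (not taken from the note) runs one ALS sweep for a factorization A ≈ XYᵀ and one step of Orthogonal Iterations on G = AᵀA from the same starting subspace, and reports the distance between the two subspaces after every sweep. The matrix sizes, the random starting basis, and the closed-form least-squares updates are illustrative assumptions, not the note's implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, k = 60, 40, 5
    A = rng.standard_normal((m, n))
    G = A.T @ A                        # symmetric matrix used by Orthogonal Iterations

    # Both methods start from the same k-dimensional subspace of R^n.
    Y = rng.standard_normal((n, k))    # ALS factor in A ~ X @ Y.T
    Q, _ = np.linalg.qr(Y)             # orthonormal basis of the same subspace

    for sweep in range(1, 16):
        # One ALS sweep: solve the two least-squares problems in turn.
        X = np.linalg.lstsq(Y, A.T, rcond=None)[0].T   # fix Y, minimize ||A - X Y^T||_F over X
        Y = np.linalg.lstsq(X, A, rcond=None)[0].T     # fix X, minimize ||A - X Y^T||_F over Y

        # One step of Orthogonal Iterations on G.
        Q, _ = np.linalg.qr(G @ Q)

        # Distance between the two subspaces via their orthogonal projectors.
        QY, _ = np.linalg.qr(Y)
        dist = np.linalg.norm(QY @ QY.T - Q @ Q.T)
        print(f"sweep {sweep:2d}: subspace distance = {dist:.2e}")

In exact arithmetic the reported distance is zero at every sweep (assuming the iterates keep full column rank), since one ALS sweep maps span(Y) to span(GY), exactly as one Orthogonal Iterations step maps span(Q) to span(GQ); in floating point it stays at roundoff level.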

Highlights

  • The alternating least squares (ALS) method has several important applications, e.g., [1]-[54]

  • As noted in the introduction, the relations between Alternating Least Squares (ALS) and the block-Power method were recently observed in the context of matrix completion algorithms

  • The related matrix completion algorithms differ substantially from the classic versions discussed in this paper

Introduction

The alternating least squares (ALS) method has several important applications, e.g., [1]-[54]. It is widely used in problems where standard SVD methods are not applicable. These problems include, for example, nonnegative matrix factorization [6], [17], [28], [35], [36], matrix completion problems [5], [14], [15], [20], [23], [24], [30], [54], and tensor approximations [12], [18], [29], [48], [49], [50]. Let A ∈ ℝ^(m×n) be a given large sparse matrix, and let k be a given integer.
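
To make this setup concrete, here is a small, hypothetical sketch (not from the paper) of a basic ALS loop for a rank-k approximation A ≈ XYᵀ of a sparse matrix: each sweep needs only products with A and Aᵀ plus k×k solves, and the resulting error is compared with the optimal rank-k error obtained from a truncated SVD. The matrix, its size and density, and the number of sweeps are illustrative assumptions.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import svds

    rng = np.random.default_rng(1)
    m, n, k = 2000, 1500, 8

    # A hypothetical sparse matrix; ALS touches it only through sparse products.
    A = sp.random(m, n, density=0.005, format="csr", random_state=1)

    Y = rng.standard_normal((n, k))                    # initial factor in A ~ X @ Y.T
    for _ in range(30):
        X = np.linalg.solve(Y.T @ Y, (A @ Y).T).T      # fix Y, update X (k x k system)
        Y = np.linalg.solve(X.T @ X, (A.T @ X).T).T    # fix X, update Y (k x k system)

    # ||A - X Y^T||_F computed without forming the dense residual.
    normA2 = sp.linalg.norm(A) ** 2
    cross = np.sum((A @ Y) * X)                        # trace(A^T X Y^T)
    low2 = np.trace((X.T @ X) @ (Y.T @ Y))             # ||X Y^T||_F^2
    err_als = np.sqrt(normA2 - 2.0 * cross + low2)

    # Optimal rank-k error from the k largest singular values.
    s = svds(A, k=k, return_singular_vectors=False)
    err_svd = np.sqrt(normA2 - np.sum(s**2))
    print(f"ALS rank-{k} error: {err_als:.4f}   best possible (SVD): {err_svd:.4f}")

As the sweeps proceed the ALS error approaches the truncated-SVD optimum; the point of the sketch is that the per-sweep cost is dominated by the two sparse matrix products, which is what makes ALS attractive when a full SVD of a large sparse matrix is out of reach.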
