Abstract

Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. This paper undertakes a thorough investigation of the convergence properties of Oja's algorithm. The asymptotic convergence rates of the algorithm are established, and the dependence of the algorithm on its initial weight matrix and on the singularity of the data covariance matrix is comprehensively addressed.
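For readers unfamiliar with the algorithm under study, the following is a minimal sketch of the standard Oja subspace update, W_{k+1} = W_k + eta_k (x_k y_k^T - W_k y_k y_k^T) with y_k = W_k^T x_k. The dimensions, step size, and synthetic data stream below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def oja_subspace_step(W, x, eta):
    """One step of Oja's subspace rule: W <- W + eta * (x y^T - W y y^T), y = W^T x."""
    y = W.T @ x                                        # project sample onto current subspace estimate
    return W + eta * (np.outer(x, y) - W @ np.outer(y, y))

# Illustrative usage: track a p-dimensional principal subspace of an n-dimensional stream.
rng = np.random.default_rng(0)
n, p, eta = 10, 3, 0.005
W = np.linalg.qr(rng.standard_normal((n, p)))[0]       # non-degenerate initial weight matrix
for _ in range(5000):
    x = rng.standard_normal(n) * np.sqrt(np.arange(1, n + 1))  # stream with distinct variances
    W = oja_subspace_step(W, x, eta)
# The columns of W now approximately span the principal subspace of the data covariance matrix.
```

As the abstract indicates, the behavior of this iteration depends on the choice of the initial weight matrix W and on whether the data covariance matrix is singular; the constants above are chosen only so the sketch runs stably.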
