Abstract

This paper undertakes a detailed study of Oja's learning equation in neural networks. Fundamental issues such as the existence, uniqueness, and representation of solutions are completely resolved, as is the question of convergence: the solution of Oja's equation is shown to converge exponentially to an equilibrium from any initial value. Moreover, necessary and sufficient conditions on the initial value are given for the solution to converge to a dominant eigenspace of the associated autocorrelation matrix. As a by-product, this result confirms Oja's conjecture that the solution converges to the principal eigenspace from almost all initial values. Further characteristics of the limiting solution are revealed, making it possible to determine the limiting solution in advance from the initial value alone. Two examples are analyzed that demonstrate the explicit dependence of the limiting solution on the initial value. Finally, Oja's equation is shown to be the gradient flow of generalized Rayleigh quotients on a Stiefel manifold.
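The abstract does not reproduce the equation itself. In the literature, Oja's flow in matrix form is commonly written as dW/dt = C W - W W^T C W, where C is the autocorrelation matrix and W(t) is an n-by-p weight matrix. The Python sketch below is an illustration under those assumptions, not code from the paper: the matrix C and the initial value W0 are arbitrary stand-ins. It integrates the flow with forward Euler and checks the behavior the abstract describes, namely that for a generic initial value W(t) tends to an orthonormal matrix whose column span is the dominant p-dimensional eigenspace of C.

```python
import numpy as np

# Illustrative sketch (not from the paper): integrate the standard
# matrix form of Oja's flow,
#     dW/dt = C W - W W^T C W,   W(0) = W0  (n x p),
# and verify convergence of span(W) to the dominant eigenspace of C.

rng = np.random.default_rng(0)
n, p = 6, 2

# An arbitrary symmetric positive semidefinite "autocorrelation" matrix.
A = rng.standard_normal((n, n))
C = A @ A.T / n

W = rng.standard_normal((n, p))   # generic initial value W0
dt = 0.01
for _ in range(20000):            # forward Euler integration of the flow
    W += dt * (C @ W - W @ (W.T @ C @ W))

# Dominant p-dimensional eigenspace of C (top-p eigenvectors; eigh
# returns eigenvalues in ascending order, so take the last p columns).
eigvals, eigvecs = np.linalg.eigh(C)
U = eigvecs[:, -p:]

# Checks: W^T W should be near the identity, and the orthogonal
# projectors onto span(W) and the dominant eigenspace should agree.
Q, _ = np.linalg.qr(W)
print("orthonormality error:", np.linalg.norm(W.T @ W - np.eye(p)))
print("distance to dominant eigenspace:", np.linalg.norm(Q @ Q.T - U @ U.T))
```

With a generic random W0 both printed errors should be close to zero, consistent with the almost-all-initial-values statement; an initial value chosen orthogonal to the dominant eigenspace would violate the necessary and sufficient condition, and the column span would then converge to a different invariant subspace of C.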
