Abstract
We derive and discuss various generalizations of neural PCA (Principal Component Analysis)-type learning algorithms containing nonlinearities, using an optimization-based approach. Standard PCA arises as the optimal solution to several different information representation problems. We argue that this is essentially because the solution is based on second-order statistics only. If the respective optimization problems are generalized to nonquadratic criteria so that higher-order statistics are taken into account, their solutions will in general differ. These solutions define in a natural way several meaningful extensions of PCA and give them a solid foundation. In this framework, we study more closely generalizations of the variance maximization and mean-square error minimization problems. For these problems, we derive gradient-type neural learning algorithms for both symmetric and hierarchic PCA-type networks. As an important special case, Sanger's well-known generalized Hebbian algorithm (GHA) is shown to emerge from natural optimization problems.
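For orientation, the sketch below shows one update step of the GHA rule named in the abstract, written in plain NumPy. The function name `gha_step` and the learning-rate parameter are illustrative choices, not notation from the paper; the nonlinear generalizations the paper derives would introduce a nonlinearity into the outputs, which is only indicated in a comment here rather than implemented.

```python
import numpy as np

def gha_step(W, x, lr=0.01):
    """One update of Sanger's generalized Hebbian algorithm (GHA).

    W : (m, n) weight matrix; its rows converge toward the first m
        principal eigenvectors of the input covariance.
    x : (n,) input sample, assumed zero-mean.
    lr: learning rate (illustrative value).
    """
    y = W @ x  # linear outputs y_i = w_i^T x
    # Sanger's rule: dW = lr * (y x^T - LT[y y^T] W), where LT[.] keeps the
    # lower-triangular part, giving the hierarchic (deflation-like) structure.
    dW = lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    # The paper's nonlinear PCA-type generalizations replace (some of) the
    # linear outputs y by g(y) for a nonquadratic criterion, e.g. g = np.tanh.
    return W + dW
```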