Principal components analysis is an important and well-studied subject in statistics and signal processing. Several algorithms exist for solving this problem, and most can be grouped into one of three approaches: adaptation based on Hebbian updates and deflation, optimization of a second-order statistical criterion (such as reconstruction error or output variance), and fixed-point update rules with deflation. In this study, we propose an alternative approach that avoids deflation and gradient-search techniques. The proposed method is an on-line procedure that recursively updates the eigenvector and eigenvalue matrices with every new sample, so that the estimates approximately track the values that would be obtained analytically from the current sample estimate of the data covariance matrix. The perturbation technique is theoretically shown to be applicable to recursive canonical correlation analysis as well. The performance of this algorithm is compared with that of a structurally similar matrix perturbation-based method and with traditional methods such as Sanger's rule and APEX.
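To make the on-line update concrete, the following is a minimal sketch of a first-order matrix perturbation step for tracking an eigendecomposition as new samples arrive. It assumes a rank-one recursive covariance update with a hypothetical forgetting factor `eta`; the paper's exact recursion and correction terms may differ, so this should be read as an illustration of the general perturbation idea rather than the authors' algorithm.

```python
import numpy as np

def perturbation_pca_update(V, lam, x, eta=0.01, eps=1e-12):
    """One on-line update of the eigenvector matrix V (columns = eigenvectors)
    and eigenvalue vector lam via first-order matrix perturbation theory.

    Assumed covariance recursion (illustrative, not from the paper):
        R_new = (1 - eta) * R_old + eta * x x^T
    The change dR = R_new - R_old is treated as a small perturbation of the
    previous decomposition R_old = V diag(lam) V^T.
    """
    n = V.shape[0]
    R_old = V @ np.diag(lam) @ V.T
    dR = eta * (np.outer(x, x) - R_old)

    # Project the perturbation onto the current eigenbasis: G[i, j] = v_i^T dR v_j.
    G = V.T @ dR @ V

    # First-order eigenvalue correction: diagonal entries of G.
    lam_new = lam + np.diag(G)

    # First-order eigenvector correction: mix in the remaining eigenvectors.
    dV = np.zeros_like(V)
    for i in range(n):
        for j in range(n):
            if j != i and abs(lam[i] - lam[j]) > eps:
                dV[:, i] += (G[j, i] / (lam[i] - lam[j])) * V[:, j]
    V_new = V + dV

    # Re-orthonormalize so the basis stays well conditioned over many updates.
    Q, _ = np.linalg.qr(V_new)
    return Q, lam_new
```

In a typical usage of such a sketch, V would be initialized to the identity matrix and lam to small positive values, centered samples x would be fed in one at a time, and the tracked decomposition could be checked periodically against np.linalg.eigh applied to the running sample covariance.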