Abstract

Principal components analysis is an important and well-studied subject in statistics and signal processing. The literature offers an abundance of algorithms for solving this problem, most of which can be grouped into one of three approaches: adaptation based on Hebbian updates and deflation, optimization of a second-order statistical criterion (such as reconstruction error or output variance), and fixed-point update rules with deflation. In this paper, we take a completely different approach that avoids deflation and gradient-based optimization of a cost function. The proposed method updates the eigenvector and eigenvalue matrices simultaneously with every new sample such that the estimates approximately track their true values, as would be calculated from the current sample estimate of the data covariance matrix. The performance of this algorithm is compared with that of traditional methods like Sanger's rule and APEX, as well as a structurally similar matrix perturbation-based method.
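As context for the comparison, the first family above (Hebbian updates plus deflation) is exemplified by Sanger's rule, i.e., the generalized Hebbian algorithm. A minimal single-sample sketch in Python, assuming zero-mean inputs; the learning rate and the function name are illustrative, not from the paper:

```python
import numpy as np

def sanger_update(W, x, eta=0.01):
    """One step of Sanger's rule (generalized Hebbian algorithm).

    W   : (m, n) current estimate of the top-m principal directions (rows)
    x   : (n,)   zero-mean input sample
    eta : learning rate (illustrative value)
    """
    y = W @ x  # component outputs
    # The lower-triangular term subtracts each row's projection onto the
    # rows above it -- deflation in disguise.
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

Because of that lower-triangular term, row i can only settle after rows 1 through i-1 have converged; this sequential behavior is exactly what the proposed simultaneous update avoids.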

Highlights

  • Principal components analysis (PCA) is a well-known statistical technique that has been widely applied to solve important signal processing problems like feature extraction, signal estimation, detection, and speech separation [1, 2, 3, 4].

  • Most of the analytical methods require extensive matrix operations and are therefore unsuited for real-time applications. In many applications, such as direction of arrival (DOA) tracking and adaptive subspace estimation, signal statistics change over time, rendering block methods virtually unusable.

  • We recently explored a simultaneous principal component extraction algorithm called SIPEX [14], which restricted the gradient search to the space of orthonormal matrices by using Givens rotations (a sketch of this parameterization appears after this list).
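The structural idea behind SIPEX is that any n-by-n orthonormal matrix can be written as a product of n(n-1)/2 plane (Givens) rotations, so the gradient search runs over rotation angles rather than matrix entries. A minimal sketch of the standard parameterization, assuming a conventional angle ordering (the exact convention in [14] may differ, and the function names are illustrative):

```python
import numpy as np

def givens(n, i, j, theta):
    """Plane (Givens) rotation acting on coordinates i and j of R^n."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G

def orthonormal_from_angles(n, angles):
    """Compose n(n-1)/2 plane rotations into one orthonormal matrix.

    `angles` are the free parameters a SIPEX-style gradient search
    would adapt.
    """
    Q = np.eye(n)
    idx = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            Q = Q @ givens(n, i, j, angles[idx])
            idx += 1
    return Q
```

Because the output is orthonormal by construction, a search over `angles` never leaves the constraint set, removing any need for explicit reorthonormalization.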


Summary

INTRODUCTION

Principal components analysis (PCA) is a well-known statistical technique that has been widely applied to solve important signal processing problems like feature extraction, signal estimation, detection, and speech separation [1, 2, 3, 4]. The recently proposed fixed-point PCA algorithm [13] showed fast convergence with little or no increase in complexity compared with gradient methods. This method, like most existing methods in the literature, relies on the standard deflation technique, which forces sequential convergence of the principal components and potentially reduces the overall speed of convergence. We present an algorithm that takes a similar perturbation approach but, in contrast, keeps the covariance matrix decomposed into its eigenvectors and eigenvalues at all times, which reduces the perturbation step to one applied to the diagonal eigenvalue matrix. This further restriction of structure, as expected, alleviates the difficulties encountered in the operation of the previous first-order perturbation algorithm, resulting in a fast-converging and accurate subspace tracking algorithm. We conclude the paper with remarks and observations about the algorithm.
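The structure described above can be made concrete. If the running sample covariance obeys the rank-one recursion C_k = ((k-1)/k) C_{k-1} + (1/k) x_k x_k^T and the previous estimate is factored as C_{k-1} = Q D Q^T, then C_k = Q M Q^T, where M is diagonal plus rank one in the current eigenbasis; only M needs to be (approximately) re-diagonalized at each step. A minimal sketch, assuming zero-mean data; the direct `eigh` call below stands in for the paper's closed-form perturbation of M, and the names are illustrative:

```python
import numpy as np

def recursive_pca_step(Q, d, x, k):
    """One rank-one update of a tracked eigendecomposition.

    Q : (n, n) current orthonormal eigenvector estimate (columns)
    d : (n,)   current eigenvalue estimate (diagonal of D)
    x : (n,)   new zero-mean data sample
    k : int    sample index, k >= 2
    """
    y = Q.T @ x  # new sample expressed in the current eigenbasis
    alpha = (k - 1) / k
    # C_k = Q M Q^T, with M diagonal plus a rank-one term:
    M = alpha * np.diag(d) + (1.0 / k) * np.outer(y, y)
    # Exact solve for illustration; the paper uses a perturbation
    # approximation of this diagonal-plus-rank-one eigenproblem.
    lam, V = np.linalg.eigh(M)  # eigenvalues returned in ascending order
    return Q @ V, lam  # rotated eigenvectors, updated eigenvalues
```

Replacing the inner `eigh` with a cheap perturbation approximation of the diagonal-plus-rank-one eigenproblem is what makes a sample-by-sample update of all eigenvectors and eigenvalues computationally feasible.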

PROBLEM DEFINITION
RECURSIVE PCA DESCRIPTION
Perturbation analysis for rank-one update
The recursive PCA algorithm
Extension to complex-valued PCA
NUMERICAL EXPERIMENTS
Convergence speed analysis
Comparison with first-order perturbation PCA
Direction of arrival estimation
An example with 20 dimensions
CONCLUSIONS

