Abstract

In this paper, we propose new adaptive algorithms for extracting and tracking the minor (least significant) or, alternatively, the principal eigenvectors of a positive Hermitian covariance matrix. The main advantage of the proposed algorithms is their low computational complexity and numerical stability, even in the minor component analysis case. The algorithms are fast in the sense that their computational cost is O(np) flops per iteration, where n is the size of the observation vector and p < n is the number of eigenvectors to estimate. We consider Oja-type minor component algorithms based on constrained and unconstrained stochastic gradient techniques. Using appropriate fast orthogonalization procedures, we introduce new fast algorithms that extract the minor (or principal) eigenvectors and guarantee both good numerical stability and the orthogonality of their weight matrix at each iteration. To achieve a faster convergence rate, we also propose normalized versions of these algorithms that seek the optimal step size. Our algorithms perform similarly to, or even better than, other existing algorithms of higher complexity, as illustrated by our simulation results.
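As a point of reference for the Oja-type updates described above, the following is a minimal NumPy sketch of a reversed-sign (anti-Oja) minor-component step followed by re-orthonormalization. The function name, the plain QR step, and the fixed step size are illustrative assumptions, not the paper's exact method: the paper's fast orthogonalization procedures and optimal step-size selection keep the per-iteration cost at O(np), whereas the QR shown here costs O(np^2).

```python
import numpy as np

def oja_mca_step(W, x, mu):
    """One anti-Oja minor-component update (illustrative sketch).

    W  : (n, p) current orthonormal estimate of the p minor eigenvectors
    x  : (n,)   new observation vector
    mu : step size (fixed here; the paper derives an optimal one)
    """
    y = W.conj().T @ x  # p projections of the observation, O(np)
    # Reversed-sign Oja rule: move away from the principal subspace.
    W = W - mu * (np.outer(x, y.conj()) - W @ np.outer(y, y.conj()))
    # Re-orthonormalize the weight matrix at each iteration. QR is used
    # for clarity; it costs O(np^2), while the paper's fast
    # orthogonalization procedures stay at O(np).
    Q, _ = np.linalg.qr(W)
    return Q

# Usage sketch: track the 2 minor eigenvectors of a streaming signal
# whose covariance is diagonal with increasing variances, so the minor
# subspace is spanned by the first two coordinate axes.
rng = np.random.default_rng(0)
n, p = 8, 2
W = np.linalg.qr(rng.standard_normal((n, p)))[0]
for _ in range(5000):
    x = rng.standard_normal(n) * np.linspace(1.0, 3.0, n)
    W = oja_mca_step(W, x, mu=0.005)
```

The explicit re-orthonormalization is what keeps the iteration numerically stable: the plain anti-Oja gradient step alone is known to let the weight matrix drift away from orthonormality, which is the instability the paper's fast orthogonalization procedures are designed to prevent.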
