Abstract
We introduce a class of adaptive filters based on sequential adaptive eigendecomposition (subspace tracking) of the data covariance matrix. These new algorithms are completely rank revealing and hence can handle the following two relevant data cases where conventional recursive least squares (RLS) methods fail to provide satisfactory results: (1) highly oversampled smooth data with a rank-deficient or almost rank-deficient covariance matrix and (2) noise-corrupted data where a signal must be separated effectively from superimposed noise. This paper contradicts the widely held belief that rank-revealing algorithms must be computationally more demanding than conventional RLS. A spatial RLS adaptive filter has a complexity of O(N²) operations per time step, where N is the filter order. The corresponding low-rank adaptive filter requires only O(Nr) operations per time step, where r ≤ N denotes the rank of the data covariance matrix. Thus, low-rank adaptive filters can be computationally less (or even much less) demanding, depending on the order/rank ratio N/r, i.e., the compressibility of the signal. Simulation results substantiate our claims. This paper is devoted to the theory and application of fast orthogonal iteration and bi-iteration subspace tracking algorithms.
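The paper's fast O(Nr) recursions are not reproduced here, but the orthogonal iteration at their core can be illustrated with a minimal sketch: an exponentially weighted covariance estimate is updated rank-one at each step, and a single orthogonal-iteration step (matrix product followed by QR) tracks its dominant r-dimensional subspace. All names, the dimensions N = 8 and r = 2, the forgetting factor, and the synthetic rank-2 signal model are illustrative assumptions, not the paper's own setup; this didactic version still forms the full covariance and so costs O(N²r) per step, unlike the fast algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
N, r, beta = 8, 2, 0.98   # filter order, assumed signal rank, forgetting factor

# Hypothetical test signal: rank-2 process in a fixed subspace V, plus weak white noise
V = np.linalg.qr(rng.standard_normal((N, r)))[0]   # true signal subspace (orthonormal)

def sample():
    return V @ rng.standard_normal(r) + 0.01 * rng.standard_normal(N)

C = np.zeros((N, N))                               # exponentially weighted covariance
Q = np.linalg.qr(rng.standard_normal((N, r)))[0]   # tracked subspace estimate

for _ in range(500):
    x = sample()
    C = beta * C + np.outer(x, x)   # rank-one covariance update
    Q, _ = np.linalg.qr(C @ Q)      # one orthogonal-iteration step

# Residual of the true subspace outside span(Q); small if tracking succeeded
err = np.linalg.norm(V - Q @ (Q.T @ V))
print(f"subspace residual: {err:.3e}")
```

Note that the subspace estimate Q, not the full eigendecomposition, is what a low-rank adaptive filter needs: projecting the data onto span(Q) is exactly the separation of signal from superimposed noise that the abstract describes for case (2).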