Abstract

This paper derives an identification solution of the ARMA-type optimal linear predictor as a time-varying lattice of arbitrarily fixed dimension for a process of which only the output signal is known. The projection technique introduced here leads to a hereditary algorithm that extends, adaptively and to raw data, the authors' previous results on lattice realization from given autocorrelation functions. It produces a minimum-phase linear model of the signal in which the nth-order whiteness of the associated innovation has a restricted meaning: orthogonality to an n-dimensional subspace memory of the past in a suitable Hilbert sequence space. The L² metric of that sequence space leads to a least-squares identification algorithm that satisfies a certainty equivalence principle with respect to the corresponding realization algorithm (i.e., sample correlation products replace true correlation terms). Owing to the detailed time-varying state-space computations, this is achieved while avoiding the well-known side errors from missing correlation products that usually arise when the output autocorrelation is bluntly replaced by averaged sample products. Application examples show the superiority of the hereditary algorithm over classical recursive and nonrecursive algorithms in terms of accuracy, adaptivity, and order-reduction capabilities.
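To make the idea of "sample correlation products replace true correlation terms" concrete, the following is a minimal sketch of a classical lattice identification baseline (a Burg-type recursion), not the authors' hereditary algorithm: reflection coefficients are estimated stage by stage from sample products of forward and backward prediction errors, and the resulting lattice is automatically minimum phase since each coefficient has magnitude at most one. The function name and interface are illustrative assumptions.

```python
import numpy as np

def burg_lattice(x, order):
    """Estimate `order` lattice reflection (PARCOR) coefficients from raw data x.

    Sample products of forward/backward prediction errors stand in for the
    true correlation terms of the underlying process (illustrative sketch,
    not the hereditary algorithm of the paper).
    """
    f = np.asarray(x, dtype=float).copy()  # forward prediction errors
    b = f.copy()                           # backward prediction errors
    k = np.zeros(order)                    # reflection coefficients
    for m in range(order):
        fm, bm = f[1:], b[:-1]             # time-align the two error sequences
        # Sample correlation products in place of true correlations:
        k[m] = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        # Lattice order-update recursions
        f, b = fm + k[m] * bm, bm + k[m] * fm
    return k
```

For a first-order autoregressive signal x[n] = 0.8 x[n-1] + e[n], the first reflection coefficient converges to roughly -0.8, and every |k[m]| ≤ 1 by construction, which guarantees the minimum-phase property mentioned in the abstract.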
