Abstract

The linear least-mean-square error (LMS) estimate of a scalar random variable given an observation of a vector-valued random variable (data) is well known. Computation of the estimate requires knowledge of the data correlation matrix. Algorithms have been proposed by Griffiths [9] and by Widrow [7] for iterative determination of the estimate of each element from a sequence of scalar random variables given an observation of the corresponding element from a sequence of data vectors when the data correlation matrix is not known. These algorithms are easy to implement, require little storage, and are suitable for real-time processing. Past convergence studies of these algorithms have assumed that the data vectors were mutually independent. In this study some asymptotic properties of these and other related algorithms are derived for a sequence of mutually correlated data vectors. A generalized algorithm is defined for analytic purposes. It is demonstrated for this generalized algorithm that excess mean-square error (as defined by Widrow) can be made arbitrarily small for large values of time in the correlated case. The analysis can be applied to a particular estimation scheme if 1) the particular algorithm can be placed in the generalized form, and 2) the given assumptions are satisfied. The analysis of the generalized algorithm requires that the data vectors possess only a few properties; foremost among these are ergodicity and a form of asymptotic independence. This analysis does not assume any particular probability distribution function nor any particular form of mutual correlation for the data vectors.
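The iterative estimation the abstract refers to can be illustrated by a minimal sketch of the Widrow LMS update, w_{n+1} = w_n + mu * e_n * x_n, applied to data vectors that are mutually correlated across time rather than independent. The AR(1)-style data generation, step size, and dimensions below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth weights for the synthetic example.
w_true = np.array([1.0, -0.5, 0.25])
mu = 0.01          # step size; must be small for the excess MSE to stay small
w = np.zeros(3)    # initial weight estimate

x_prev = rng.standard_normal(3)
for n in range(20000):
    # Mutually correlated data vectors (each depends on the previous one),
    # mimicking the correlated-data setting the abstract describes.
    x = 0.5 * x_prev + rng.standard_normal(3)
    x_prev = x
    d = w_true @ x + 0.01 * rng.standard_normal()  # noisy scalar observation
    e = d - w @ x            # a priori estimation error
    w = w + mu * e * x       # Widrow LMS update

print(np.round(w, 2))
```

A smaller step size mu reduces the steady-state excess mean-square error at the cost of slower convergence, which is the trade-off the abstract's asymptotic results quantify.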
