Abstract

For a discrete time second-order stationary process, the Levinson–Durbin recursion is used to determine the coefficients of the best linear predictor of the observation at time k + 1, given k previous observations, best in the sense of minimizing the mean square error. The coefficients determined by the recursion define a Levinson–Durbin sequence. We also define a generalized Levinson–Durbin sequence and note that binomial coefficients form a special case of a generalized Levinson–Durbin sequence. All generalized Levinson–Durbin sequences are shown to obey summation formulas which generalize formulas satisfied by binomial coefficients. Levinson–Durbin sequences arise in the construction of several autoregressive model coefficient estimators. The least squares autoregressive estimator does not give rise to a Levinson–Durbin sequence, but least squares fixed point processes, which yield least squares estimates of the coefficients unbiased to order 1/T, where T is the sample length, can be combined to construct a Levinson–Durbin sequence. By contrast, analogous fixed point processes arising from the Yule–Walker estimator do not combine to construct a Levinson–Durbin sequence, although the Yule–Walker estimator itself does determine a Levinson–Durbin sequence. The least squares and Yule–Walker fixed point processes are further studied when the mean of the process is a polynomial time trend that is estimated by least squares.
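For reference, the following is a minimal Python sketch of the standard Levinson–Durbin recursion for computing the best linear predictor coefficients from a sequence of autocovariances. It illustrates the recursion named in the abstract only; the function name `levinson_durbin`, its interface, and the AR(1) example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def levinson_durbin(r):
    """Standard Levinson-Durbin recursion (illustrative sketch).

    Given autocovariances r[0], ..., r[k] of a second-order stationary
    process, return (phi, v) where phi[n] holds the coefficients
    phi_{n,1}, ..., phi_{n,n} of the best linear predictor of the
    observation at time n + 1 from the n previous observations, and
    v[n] is the corresponding mean square prediction error.
    """
    r = np.asarray(r, dtype=float)
    k = len(r) - 1
    phi = {0: np.array([])}
    v = {0: r[0]}
    for n in range(1, k + 1):
        prev = phi[n - 1]
        # reflection (partial autocorrelation) coefficient at order n
        if n == 1:
            kappa = r[1] / r[0]
        else:
            kappa = (r[n] - np.dot(prev, r[n - 1:0:-1])) / v[n - 1]
        new = np.empty(n)
        new[n - 1] = kappa
        if n > 1:
            # phi_{n,j} = phi_{n-1,j} - kappa * phi_{n-1,n-j}
            new[: n - 1] = prev - kappa * prev[::-1]
        phi[n] = new
        v[n] = v[n - 1] * (1.0 - kappa ** 2)
    return phi, v

# Example (hypothetical): autocovariances of an AR(1) process
# X_t = 0.6 X_{t-1} + e_t with unit innovation variance.
r = [0.6 ** h / (1 - 0.36) for h in range(5)]
phi, v = levinson_durbin(r)
print(phi[4])  # approximately [0.6, 0, 0, 0]
```

In this sketch, the successive coefficient vectors phi[1], phi[2], ..., phi[k] are the output of the recursion described in the abstract; the precise definition of a (generalized) Levinson–Durbin sequence is given in the paper itself.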
