Abstract

For regression models with first-order autocorrelated disturbances, the traditional prescription of econometricians is to correct for serial correlation by using appropriate estimation techniques such as the Cochrane-Orcutt, Hildreth-Lu, or Prais-Winsten procedures. This prescription, however, does not take into account the fact that in most economic applications the independent variables contain measurement errors. When there are errors in the variables, ordinary least squares, without any attempt to correct for serial correlation, may in several cases yield less unreliable and misleading results (in terms of the biases and mean squared errors of the estimators). This paradoxical finding is demonstrated analytically for the asymptotic case and illustrated for finite samples by Monte Carlo experiments, for the regression model with a single stationary explanatory variable. The Monte Carlo experiments also suggest that the same conclusion holds if the independent variable follows a random walk, at least for medium-size samples.
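The comparison described in the abstract can be sketched in a small Monte Carlo experiment. The code below is only an illustration of the setup, not the paper's actual design: all parameter values (the AR(1) coefficients, error variances, sample size, and number of replications) are assumptions chosen for demonstration. It simulates a single-regressor model with AR(1) disturbances, adds measurement error to the observed regressor, and compares the bias and mean squared error of the plain OLS slope against an iterated Cochrane-Orcutt estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, beta=1.0, rho=0.7, phi=0.5, sigma_u=1.0, sigma_m=0.5):
    """One draw of the model y = beta * x_true + u, with AR(1) disturbances u
    and a mismeasured regressor x_obs = x_true + measurement noise.
    All parameter values here are illustrative assumptions."""
    # Stationary AR(1) true regressor (the paper also considers a random walk).
    x_true = np.zeros(n)
    for t in range(1, n):
        x_true[t] = phi * x_true[t - 1] + rng.normal()
    # First-order autocorrelated disturbances.
    u = np.zeros(n)
    eps = rng.normal(scale=sigma_u, size=n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]
    y = beta * x_true + u
    # The econometrician only observes x with measurement error.
    x_obs = x_true + rng.normal(scale=sigma_m, size=n)
    return y, x_obs

def ols_slope(y, x):
    """Slope of a simple (demeaned) OLS regression of y on x."""
    x_c, y_c = x - x.mean(), y - y.mean()
    return x_c @ y_c / (x_c @ x_c)

def cochrane_orcutt(y, x, n_iter=10):
    """Iterated Cochrane-Orcutt: estimate rho from OLS residuals,
    quasi-difference the data, and re-estimate the slope."""
    b = ols_slope(y, x)
    for _ in range(n_iter):
        e = (y - y.mean()) - b * (x - x.mean())
        rho_hat = e[1:] @ e[:-1] / (e[:-1] @ e[:-1])
        b = ols_slope(y[1:] - rho_hat * y[:-1], x[1:] - rho_hat * x[:-1])
    return b

beta = 1.0
ols_est, co_est = [], []
for _ in range(500):
    y, x = simulate(beta=beta)
    ols_est.append(ols_slope(y, x))
    co_est.append(cochrane_orcutt(y, x))

ols_est, co_est = np.array(ols_est), np.array(co_est)
print("OLS             bias: %+.3f  MSE: %.3f"
      % (ols_est.mean() - beta, ((ols_est - beta) ** 2).mean()))
print("Cochrane-Orcutt bias: %+.3f  MSE: %.3f"
      % (co_est.mean() - beta, ((co_est - beta) ** 2).mean()))
```

Because the measurement error attenuates both estimators toward zero, the Monte Carlo output lets one check, for a given parameter configuration, whether correcting for serial correlation actually helps or (as the paper argues can happen) makes matters worse.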
