Abstract

We study the use of a zero mean first difference model to forecast the level of a scalar time series that is stationary in levels. Let bias be the average value of a series of forecast errors. Then the bias of forecasts from a misspecified ARMA model for the first difference of the series will tend to be smaller in magnitude than the bias of forecasts from a correctly specified model for the level of the series. Formally, let P be the number of forecasts. Then the bias from the first difference model has expectation zero and a variance that is O(1/P²), while the variance of the bias from the levels model is generally O(1/P). With a driftless random walk as our first difference model, we confirm this theoretical result with simulations and empirical work: the bias of random walk forecasts is generally one-tenth to one-half the magnitude of the bias of forecasts from an appropriately specified model fit to levels.
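The mechanism behind the O(1/P²) rate is that one-step random walk forecast errors telescope: the average of the errors y(t+1) − y(t) over P forecasts reduces to (y(T+P) − y(T))/P, which has variance of order 1/P² when the level series is stationary, while the levels model's average error is a mean of P innovations and so has variance of order 1/P. The Monte Carlo sketch below illustrates this contrast. It is not the paper's simulation design; the AR(1) data-generating process, its parameters (rho, mu, sigma), and the sample sizes (R, P, reps) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, mu, sigma = 0.9, 5.0, 1.0   # illustrative AR(1) parameters (assumed, not from the paper)
R, P, reps = 200, 100, 2000      # estimation sample, number of forecasts, replications

bias_rw, bias_ar = [], []
for _ in range(reps):
    # Simulate R + P + 1 observations of a stationary AR(1) in levels.
    y = np.empty(R + P + 1)
    y[0] = mu + rng.normal(scale=sigma / np.sqrt(1 - rho**2))  # stationary start
    for t in range(1, len(y)):
        y[t] = mu + rho * (y[t - 1] - mu) + rng.normal(scale=sigma)

    # Levels model: fit an AR(1) with intercept by OLS on the first R observations.
    Y = y[1:R]
    X = np.column_stack([np.ones(R - 1), y[:R - 1]])
    b0, b1 = np.linalg.lstsq(X, Y, rcond=None)[0]

    # One-step-ahead forecasts of y[R+1], ..., y[R+P].
    prev = y[R:R + P]                 # last observed level at each forecast origin
    actual = y[R + 1:R + P + 1]
    e_ar = actual - (b0 + b1 * prev)  # fitted levels-model errors
    e_rw = actual - prev              # driftless random walk errors

    bias_ar.append(e_ar.mean())       # bias = average forecast error
    bias_rw.append(e_rw.mean())       # telescopes to (y[R+P] - y[R]) / P

print("variance of bias, levels model:", np.var(bias_ar))  # ~ O(1/P)
print("variance of bias, random walk:", np.var(bias_rw))   # ~ O(1/P^2)
```

Rerunning the sketch with P doubled should roughly quarter the random walk's bias variance but only halve the levels model's, which is the O(1/P²) versus O(1/P) contrast stated above.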
