Abstract

U.S. prewar output series exhibit less shock persistence than postwar series. Some studies suggest this may be due to the linear interpolation used to generate missing prewar data. The Monte Carlo simulations that support this view produce large standard errors, making such inference imprecise. We assess analytically the effect of linear interpolation on a nonstationary process. We find that interpolation does reduce shock persistence, but the interpolated series can still exhibit greater shock persistence than a pure random walk. Moreover, linear interpolation makes the series periodically nonstationary, with the parameters of the data-generating process and the length of the interpolation time segments affecting shock persistence in conflicting ways.
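
As an illustration of the kind of Monte Carlo exercise referred to above, the sketch below (not taken from the paper; all parameter choices are hypothetical) simulates a simple nonstationary process, replaces the observations between evenly spaced benchmark dates with linear interpolation, and compares a variance-ratio measure of shock persistence for the original and the interpolated series. The ARIMA(0,1,1) data-generating process, the segment length m, and the differencing horizon k are illustrative assumptions, and the variance ratio is only one of several possible persistence measures.

```python
# Hypothetical Monte Carlo sketch (not the paper's code): simulate a simple
# nonstationary DGP, replace the observations between evenly spaced "benchmark"
# dates with linear interpolation, and compare a variance-ratio measure of
# shock persistence for the original and the interpolated series.
# The DGP, segment length m, and horizon k are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

def simulate_arima011(T, theta, sigma=1.0):
    """ARIMA(0,1,1): Delta y_t = eps_t + theta * eps_{t-1}, with y_0 = 0."""
    eps = rng.normal(scale=sigma, size=T + 1)
    dy = eps[1:] + theta * eps[:-1]
    return np.concatenate(([0.0], np.cumsum(dy)))

def interpolate_segments(y, m):
    """Keep every m-th observation (the 'benchmark' dates) and fill the gaps linearly."""
    idx = np.arange(len(y))
    bench = idx[::m]
    return np.interp(idx, bench, y[bench])

def variance_ratio(y, k):
    """Variance of k-period differences over k times the variance of 1-period
    differences; equals 1 in population for a pure random walk."""
    d1 = np.diff(y)
    dk = y[k:] - y[:-k]
    return dk.var() / (k * d1.var())

T, theta, m, k, reps = 121, 0.5, 5, 20, 2000
vr_orig, vr_interp = [], []
for _ in range(reps):
    y = simulate_arima011(T, theta)
    vr_orig.append(variance_ratio(y, k))
    vr_interp.append(variance_ratio(interpolate_segments(y, m), k))

print(f"mean VR, original ARIMA(0,1,1) series: {np.mean(vr_orig):.3f}")
print(f"mean VR, interpolated series:          {np.mean(vr_interp):.3f}")
print("(VR = 1 corresponds to the pure random walk benchmark)")
```

Varying theta (the DGP parameter) and m (the length of the interpolation segments) in such a sketch is one way to explore how the two can push measured persistence in different directions, which is the tension the abstract describes.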
