Abstract

Difficulties with inference in predictive regressions are generally attributed to strong persistence in the predictor series. We show that the major source of the problem is actually the nuisance intercept parameter and propose basing inference on the Restricted Likelihood, which is free of such nuisance location parameters and also possesses small curvature, making it suitable for inference. The bias of the Restricted Maximum Likelihood (REML) estimates is shown to be approximately 50% less than that of the OLS estimates near the unit root, without loss of efficiency. The error in the chi-square approximation to the distribution of the REML based Likelihood Ratio Test (RLRT) for no predictability is shown to be (3/4 - rho^2) n^{-1} (G_3(x) - G_1(x)) + O(n^{-2}), where rho is the correlation between the innovations of the predictor and the regression error and G_q(x) denotes the distribution function of a chi-square random variable with q degrees of freedom. Power under local alternatives is obtained, and extensions to more general univariate regressors and vector AR(1) regressors, where OLS may no longer be asymptotically efficient, are provided. In simulations the RLRT maintains size well, is robust to non-normal errors and has uniformly higher power than the Jansson-Moreira test, with gains that can be substantial. The Campbell-Yogo Bonferroni Q test is found to have size distortions and can be significantly oversized.
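The bias claim can be illustrated with a small Monte Carlo sketch. This is not the paper's code: it fits a stationary Gaussian AR(1) with unknown intercept by OLS and by REML, where the restricted likelihood is implemented through the generic Gaussian REML objective profiled over the error variance, and compares the average bias of the two autoregression estimates near the unit root. All function names and simulation settings below are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def neg_restricted_loglik(rho, y):
    """Profile restricted log-likelihood (negated, up to constants) for a
    stationary Gaussian AR(1) with unknown intercept: y_t = mu + rho*(y_{t-1} - mu) + e_t.
    Generic REML form: (n-1) log RSS_GLS + log|V| + log(1' V^{-1} 1)."""
    n = len(y)
    idx = np.arange(n)
    # Stationary AR(1) covariance (up to sigma^2): V_ij = rho^|i-j| / (1 - rho^2)
    V = rho ** np.abs(idx[:, None] - idx[None, :]) / (1.0 - rho ** 2)
    Vinv = np.linalg.inv(V)
    one = np.ones(n)
    mu_hat = (one @ Vinv @ y) / (one @ Vinv @ one)   # GLS intercept estimate
    r = y - mu_hat
    rss = r @ Vinv @ r
    _, logdetV = np.linalg.slogdet(V)
    return (n - 1) * np.log(rss) + logdetV + np.log(one @ Vinv @ one)

def reml_ar1(y):
    """REML estimate of the AR(1) coefficient by 1-D bounded optimization."""
    res = minimize_scalar(neg_restricted_loglik, args=(y,),
                          bounds=(-0.999, 0.999), method="bounded")
    return res.x

def ols_ar1(y):
    """OLS estimate: regress y_t on an intercept and y_{t-1}."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

def simulate(rho=0.9, n=50, reps=500):
    """Average estimation bias of OLS vs. REML over `reps` simulated paths."""
    b_ols, b_reml = [], []
    for _ in range(reps):
        e = rng.standard_normal(n + 100)
        y = np.zeros(n + 100)
        for t in range(1, n + 100):
            y[t] = rho * y[t - 1] + e[t]
        y = y[100:]                       # burn-in toward stationarity
        b_ols.append(ols_ar1(y) - rho)
        b_reml.append(reml_ar1(y) - rho)
    return np.mean(b_ols), np.mean(b_reml)

bias_ols, bias_reml = simulate()
print(f"mean OLS bias:  {bias_ols:+.4f}")
print(f"mean REML bias: {bias_reml:+.4f}")
```

With a persistent regressor (rho = 0.9, n = 50), the familiar downward OLS bias of roughly -(1 + 3*rho)/n should appear, while the REML bias should be noticeably smaller in magnitude, consistent with the roughly 50% reduction stated above.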
