Abstract
We propose a new approach to constructing robust hypothesis tests based on general M-estimators with possibly non-differentiable estimating functions. The proposed test employs a random normalizing matrix computed using only recursive M-estimators to eliminate the nuisance parameters arising from the asymptotic covariance matrix, and hence does not require consistent estimation of any nuisance parameters, in contrast with the conventional HAC-type test and the KVB-type test of Kiefer, Vogelsang, and Bunzel (2000, Econometrica). We also demonstrate that the proposed test reduces to the KVB-type test in simple location models (with OLS estimation). The error in rejection probability of the proposed test in a Gaussian location model is thus O_p((log T)/T), where T is the sample size. As examples, we consider robust testing in GMM, quantile regression (QR), and censored regression models. We find that the proposed test is free from any user-chosen parameters for these models, whereas both the HAC-type test and the KVB-type test with optimal GMM, QR, and censored LAD estimators would require nonparametric kernel estimation and hence depend on user-chosen parameters. Our simulations show that the proposed test dominates both the HAC- and KVB-type tests in terms of finite-sample size and has a power advantage when the latter tests are computed using inappropriate user-chosen parameters.
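To make the contrast concrete, the following is a minimal sketch of the KVB-type self-normalized t-statistic that the abstract uses as a benchmark, in the simple location model where the proposed test and the KVB-type test coincide. It assumes the standard KVB construction: the statistic is normalized by partial sums of the full-sample OLS residuals rather than by a consistent HAC estimate of the long-run variance, so no kernel or bandwidth choice is needed; the function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def kvb_tstat(x, mu0=0.0):
    """KVB-type self-normalized t-statistic for H0: E[x_t] = mu0
    in a location model estimated by the sample mean (OLS).

    The normalizer uses partial sums of the demeaned data, so it
    converges to a functional of a Brownian bridge rather than to
    the long-run variance; critical values therefore come from the
    nonstandard KVB distribution, not the standard normal.
    """
    x = np.asarray(x, dtype=float)
    T = x.size
    xbar = x.mean()
    # Partial sums S_t of the full-sample residuals x_j - xbar.
    S = np.cumsum(x - xbar)
    # Random normalizer C_hat = T^{-2} * sum_t S_t^2.
    C_hat = np.sum(S**2) / T**2
    return np.sqrt(T) * (xbar - mu0) / np.sqrt(C_hat)

# Illustrative data: AR(1) errors, i.e. serial correlation that a
# plain i.i.d. t-test would ignore but self-normalization absorbs.
rng = np.random.default_rng(0)
e = np.zeros(500)
for t in range(1, 500):
    e[t] = 0.5 * e[t - 1] + rng.standard_normal()
print(kvb_tstat(e, mu0=0.0))
```

The proposed test replaces the full-sample estimator inside the partial sums with recursive M-estimators, which is what removes the remaining nuisance parameters in the general (non-OLS) cases discussed in the abstract; that construction is not reproduced here.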