Abstract

We propose a new robust hypothesis test for (possibly non-linear) constraints on M-estimators with possibly non-differentiable estimating functions. The proposed test employs a random normalizing matrix computed from recursive M-estimators to eliminate the nuisance parameters arising from the asymptotic covariance matrix. It does not require consistent estimation of any nuisance parameters, in contrast with the conventional heteroscedasticity and autocorrelation consistent (HAC)-type test and the Kiefer–Vogelsang–Bunzel (KVB)-type test. Our test reduces to the KVB-type test in simple location models with ordinary least-squares estimation, so the error in the rejection probability of our test in a Gaussian location model is O_P(T^{-1} log T). We discuss robust testing in quantile regression and censored regression models in detail. In simulation studies, we find that our test has better size control and better finite-sample power than the HAC-type and KVB-type tests.
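To make the self-normalization idea concrete, the following is a minimal sketch, in Python, of the KVB-type self-normalized statistic in the simple location model mentioned above (the special case in which the proposed test reduces to the KVB-type test). It is not the authors' implementation: the function names, the use of simulated critical values, and the Monte Carlo settings are illustrative assumptions.

```python
import numpy as np

def kvb_self_normalized_t(x, mu0):
    """KVB-type self-normalized t-statistic for H0: E[x_t] = mu0 in a
    simple location model (illustrative sketch; names are our own)."""
    x = np.asarray(x, dtype=float)
    T = x.shape[0]
    xbar = x.mean()
    # Partial sums of the demeaned series.
    S = np.cumsum(x - xbar)
    # Self-normalizer built from the partial sums: no kernel or bandwidth
    # choice, so no nuisance parameter has to be estimated consistently.
    C = (S ** 2).sum() / T ** 2
    return np.sqrt(T) * (xbar - mu0) / np.sqrt(C)

def simulate_critical_value(alpha=0.05, reps=20000, T=2000, seed=0):
    """Approximate the two-sided critical value of the non-standard
    limiting law by applying the same statistic to i.i.d. N(0,1) data."""
    rng = np.random.default_rng(seed)
    stats = np.array([abs(kvb_self_normalized_t(rng.standard_normal(T), 0.0))
                      for _ in range(reps)])
    return np.quantile(stats, 1 - alpha)

# Usage: reject H0 at level alpha if |kvb_self_normalized_t(x, mu0)|
# exceeds the simulated cutoff.
```

Because the numerator and the self-normalizer share the same unknown long-run scale, it cancels in the ratio; the price is a non-standard limiting distribution, whose critical values are obtained here by simulation.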
