Abstract

We develop analogs of a class of weighted empirical minimum distance estimators of the underlying parameters in errors-in-variables linear regression models, when the regression error distribution and the conditional distribution of the conditionally centered measurement error, given the surrogate, are symmetric around the origin. This class of estimators is defined as the minimizers of integrals of the square of a certain symmetrized weighted empirical process of the residuals; it includes the least absolute deviation (LAD) estimator and an analog of the Hodges–Lehmann (H–L) estimator. We first develop this class of estimators when the distributions of the true covariates and measurement errors are known, and then extend it to the case when these distributions are unknown but validation data are available. In the case of Gaussian errors and covariates, the Pitman asymptotic relative efficiency of the LAD and H–L analog estimators, relative to the bias-corrected least squares estimator, tends to infinity as the variances of the components of the measurement error vector tend to infinity. A simulation study included in the paper also shows that these two estimators are significantly superior to the bias-corrected least squares estimator.
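As background (this sketch is not from the paper), the attenuation bias that motivates bias-corrected estimation can be illustrated with a minimal numpy simulation. All parameter values below are assumptions chosen for illustration: the surrogate is the true covariate plus Gaussian measurement error with known variance, the slope correction uses that known variance, and the grid-search LAD fit is a crude stand-in for an absolute-deviation criterion, not the paper's symmetrized weighted empirical process estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical errors-in-variables setup: y = beta * x + e, but only a
# surrogate w = x + u is observed, with measurement error u.
n, beta = 2000, 2.0
x = rng.normal(0.0, 1.0, n)            # true (unobserved) covariate
u = rng.normal(0.0, 1.0, n)            # measurement error, Var(u) = 1 assumed known
w = x + u                              # observed surrogate
y = beta * x + rng.normal(0.0, 1.0, n) # regression error e

# Naive least squares of y on w: attenuated toward zero,
# E[b_naive] = beta * Var(x) / (Var(x) + Var(u)) = 1.0 here.
b_naive = (w @ y) / (w @ w)

# Bias-corrected least squares, subtracting n * Var(u) from the denominator.
b_corrected = (w @ y) / (w @ w - n * 1.0)

# A crude LAD-type fit by grid search over candidate slopes; it is also
# attenuated without a measurement-error correction, which is why the
# paper develops corrected minimum distance analogs.
grid = np.linspace(0.0, 4.0, 801)
b_lad = grid[np.argmin([np.abs(y - b * w).sum() for b in grid])]
```

Running the sketch, `b_naive` concentrates near 1.0 rather than the true slope 2.0, while `b_corrected` recovers a value near 2.0, illustrating why uncorrected estimators are inconsistent in this model.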
