Abstract

The maximum likelihood estimator (MLE) is known to be asymptotically efficient, but it is not robust with respect to outliers and model misspecification. Basu et al. (1998) propose a family of density-based divergence measures called density power divergences. Each measure in this family is indexed by a single tuning parameter a, which controls the trade-off between asymptotic efficiency and robustness of the resulting estimators. The L₂-distance and the Kullback-Leibler divergence belong to this family. With an appropriately chosen tuning parameter, one obtains a minimum density power divergence estimator (MDPDE). For 0 < a < 1, the estimator lies between the MLE and the minimum L₂-distance estimator (L₂E); the MLE is efficient but not robust, while the L₂E is robust but inefficient. Hong and Kim (2001) suggest an automatic selection of a. In this paper we propose confidence intervals based on the MDPDE when the data set is contaminated, using bootstrap resampling to construct them. The resulting intervals, called MDPD bootstrap confidence intervals, are expected to be robust with respect to outliers. Their performance is investigated via a simulation study.
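As a rough illustration of the procedure the abstract describes (not the authors' code), the sketch below computes an MDPDE for an assumed normal N(μ, σ²) model by minimizing the empirical density power divergence of Basu et al. (1998), then forms a percentile bootstrap confidence interval for μ. The model choice, the value a = 0.25, the percentile bootstrap, and the helper names (dpd_objective, mdpde, mdpd_bootstrap_ci) are all illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch, assuming a normal N(mu, sigma^2) model and the
# percentile bootstrap; not the authors' implementation.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(theta, x, a):
    """Empirical density power divergence (up to terms free of theta),
    following Basu et al. (1998):
        int f^(1+a) dx - (1 + 1/a) * mean(f(x_i)^a).
    For N(mu, sigma^2), int f^(1+a) dx has the closed form used below."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # parameterize on the log scale to keep sigma > 0
    integral = (2 * np.pi) ** (-a / 2) * sigma ** (-a) / np.sqrt(1 + a)
    dens = norm.pdf(x, mu, sigma)
    return integral - (1 + 1 / a) * np.mean(dens ** a)

def mdpde(x, a):
    """Minimum density power divergence estimate of (mu, sigma)."""
    theta0 = np.array([np.median(x), np.log(np.std(x))])  # robust-ish start
    res = minimize(dpd_objective, theta0, args=(x, a), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

def mdpd_bootstrap_ci(x, a=0.25, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap confidence interval for mu based on the MDPDE."""
    rng = np.random.default_rng(seed)
    boots = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=len(x), replace=True)  # resample with replacement
        boots[b] = mdpde(xb, a)[0]
    alpha = 1 - level
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

# Example: a contaminated sample (5% outliers placed at 10)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 95), np.full(5, 10.0)])
print("MDPDE (mu, sigma):", mdpde(x, a=0.25))
print("95% bootstrap CI for mu:", mdpd_bootstrap_ci(x))
```

With a = 0.25 the objective downweights points where the fitted density is small, so the outliers at 10 pull the interval for μ far less than a bootstrap built on the sample mean or the MLE would.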
