Abstract

This article studies the minimum divergence (MD) class of estimators for econometric models specified through moment restrictions. We show that MD estimators can be obtained as solutions to a tractable, lower-dimensional optimization problem. This problem is similar to the one solved by the generalized empirical likelihood (GEL) estimators of Newey and Smith (2004), but is equivalent to it only for a subclass of divergences. The MD framework provides a coherent testing theory: tests for overidentification and parametric restrictions in this framework can be interpreted as semiparametric versions of Pearson-type goodness-of-fit tests. The higher-order properties of MD estimators are also studied, and it is shown that MD estimators with the same higher-order bias as the empirical likelihood (EL) estimator also share the same higher-order mean squared error and are all higher-order efficient. We identify members of the MD class that are not only higher-order efficient but also, unlike the EL estimator, well behaved when the moment restrictions are misspecified.
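As context for the "lower-dimensional optimization problem" mentioned above, the following is a minimal sketch of the standard duality argument behind such results; the notation is illustrative and not necessarily the article's own. Assume i.i.d. observations $x_1, \dots, x_n$, a moment function $g(x, \theta) \in \mathbb{R}^m$ satisfying $E[g(x, \theta_0)] = 0$ with $m \geq \dim(\theta)$, a convex divergence function $\phi$, and its convex conjugate $\phi^*(s) = \sup_t \{ st - \phi(t) \}$. The primal MD problem reweights the sample so that the moment restrictions hold at minimal divergence cost:
\[
\hat{\theta}_{\mathrm{MD}} \;=\; \arg\min_{\theta}\; \min_{p_1, \dots, p_n}\; \frac{1}{n}\sum_{i=1}^{n} \phi(n p_i)
\quad \text{s.t.} \quad \sum_{i=1}^{n} p_i\, g(x_i, \theta) = 0, \qquad \sum_{i=1}^{n} p_i = 1.
\]
Attaching multipliers $\eta \in \mathbb{R}$ and $\lambda \in \mathbb{R}^m$ to the two constraints and minimizing over each weight $p_i$ pointwise yields the dual saddle-point problem
\[
\hat{\theta}_{\mathrm{MD}} \;=\; \arg\min_{\theta}\; \max_{\eta, \lambda}\; \Big\{ \eta \;-\; \frac{1}{n}\sum_{i=1}^{n} \phi^{*}\!\big(\eta + \lambda' g(x_i, \theta)\big) \Big\},
\]
which involves only $1 + m + \dim(\theta)$ unknowns rather than $n$ probability weights. This coincides with the GEL saddle-point problem of Newey and Smith (2004), $\min_\theta \max_\lambda \frac{1}{n}\sum_{i=1}^{n} \rho(\lambda' g(x_i, \theta))$, when the scalar multiplier $\eta$ can be profiled out in closed form, which, consistent with the abstract, occurs only for a subclass of divergences.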
