Abstract

In this paper we consider robust parameter estimation based on a certain cross entropy and divergence. The robust estimate is defined as the minimizer of the empirically estimated cross entropy. It is shown that the robust estimate can be regarded as a kind of projection from the viewpoint of a Pythagorean relation based on the divergence. This property implies that the bias caused by outliers can become sufficiently small even in the case of heavy contamination. It is seen that the asymptotic variance of the robust estimator is naturally overweighted in proportion to the ratio of contamination. One may surmise that another form of cross entropy can present the same behavior as that discussed above. It can be proved under some conditions that no cross entropy can present the same behavior except for the cross entropy considered here and its monotone transformation.
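To make the estimation scheme above concrete, the following is a minimal sketch rather than the paper's exact construction: it assumes the cross entropy in question is a γ-type (power) cross entropy, whose empirical version replaces the unknown data density with the sample average of f(x_i; θ)^γ, and it uses a Gaussian location-scale model so that the model-side integral has a closed form. The γ value, contamination level, and model choice are illustrative only.

```python
# Sketch: robust estimation by minimizing an empirically estimated cross entropy.
# Assumption: the cross entropy is taken to be a gamma-type (power) cross entropy;
# the paper's precise definition should be consulted for the exact form.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def empirical_cross_entropy(params, x, gamma=0.5):
    """Empirical gamma-type cross entropy between the data and N(mu, sigma^2):
       -(1/gamma) * log( mean_i f(x_i)^gamma )
       + (1/(1+gamma)) * log( \int f(x)^{1+gamma} dx ),
    where the model integral has a closed form for the Gaussian."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                      # parameterize sigma > 0
    f = norm.pdf(x, loc=mu, scale=sigma)
    data_term = -np.log(np.mean(f ** gamma)) / gamma
    # \int N(x; mu, sigma^2)^{1+gamma} dx = (1+gamma)^{-1/2} (2*pi*sigma^2)^{-gamma/2}
    log_model_int = -0.5 * np.log(1.0 + gamma) - 0.5 * gamma * np.log(2.0 * np.pi * sigma ** 2)
    return data_term + log_model_int / (1.0 + gamma)

rng = np.random.default_rng(0)
# 30% contamination: bulk from N(0, 1), outliers from N(10, 1)
x = np.concatenate([rng.normal(0.0, 1.0, 700), rng.normal(10.0, 1.0, 300)])

res = minimize(empirical_cross_entropy, x0=[np.median(x), 0.0], args=(x,))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"robust estimate: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
print(f"sample mean    : {x.mean():.3f}")   # pulled toward the outliers (about 3)
```

Under this kind of heavy contamination the sample mean is dragged toward the outlying component, whereas the minimizer of the empirical cross entropy stays close to the parameters of the clean component, which is the small-bias behavior the abstract refers to.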
