Abstract

In this paper a new family of minimum divergence estimators based on the Bregman divergence is proposed, where the defining convex function has an exponential nature. These estimators avoid the necessity of using an intermediate kernel density, and many of them also have strong robustness properties. It is further demonstrated that the proposed approach can be extended to construct a class of generalized estimating equations, where the resultant pool of estimators encompasses a large variety of minimum divergence estimators and ranges from highly robust to fully efficient depending on the choice of the tuning parameters. All of the resultant estimators are M-estimators, where the defining functions make explicit use of the form of the parametric model. The properties of these estimators are discussed in detail; the theoretical results are substantiated by simulation and real data examples. It is observed that in many cases, certain robust estimators from the above generalized class provide better compromises between robustness and efficiency compared to the existing standards.
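For context, the abstract builds on the standard notion of a Bregman divergence. Given a differentiable, strictly convex function $\varphi$, the Bregman divergence between points $x$ and $y$ is the gap between $\varphi$ and its first-order Taylor approximation (the specific exponential-type $\varphi$ underlying the proposed family is defined in the paper itself and is not reproduced here):

```latex
D_{\varphi}(x, y) \;=\; \varphi(x) \;-\; \varphi(y) \;-\; \langle \nabla\varphi(y),\, x - y \rangle .
```

Convexity of $\varphi$ guarantees $D_{\varphi}(x, y) \ge 0$, with equality if and only if $x = y$; different choices of $\varphi$ recover familiar divergences (e.g., $\varphi(x) = \|x\|^2$ gives the squared Euclidean distance).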

