Abstract

Hypothesis testing is one of the fundamental paradigms of statistical inference. The three canonical hypothesis testing procedures in the statistical literature are the likelihood ratio (LR) test, the Wald test and the Rao (score) test. All three have good optimality properties, and past research has not identified any one of them as a clear winner over the other two. However, the classical versions of these tests are based on the maximum likelihood estimator (MLE), which, although asymptotically optimal, is known for its lack of robustness under outliers and model misspecification. In the present paper we provide an overview of the analogues of these tests based on the minimum density power divergence estimator (MDPDE), which offers an alternative that is strongly robust while remaining highly efficient. Since these tests have so far been studied mostly for univariate responses, here we focus primarily on their performance for several important hypothesis testing problems in the multivariate context under the multivariate normal model family.
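As a brief illustration of the estimator underlying these tests (not code from the paper), the MDPDE minimizes the empirical density power divergence objective. For a univariate normal model the integral term has a closed form, so the estimate can be obtained by direct numerical minimization. The function names, the tuning parameter choice alpha = 0.5, and the log-sigma parameterization below are illustrative assumptions, sketched with NumPy and SciPy:

```python
import numpy as np
from scipy.optimize import minimize

def dpd_objective(params, x, alpha):
    """Empirical density power divergence objective for N(mu, sigma^2).

    H_n(theta) = integral of f_theta^(1+alpha)
                 - (1 + 1/alpha) * mean(f_theta(x_i)^alpha),
    whose minimizer is the MDPDE (constant term dropped).
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize on log scale to keep sigma > 0
    f = np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
    # Closed form for the normal model: int f^(1+a) = (2*pi*sigma^2)^(-a/2) / sqrt(1+a)
    integral = (2 * np.pi * sigma ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpde_normal(x, alpha=0.5):
    """Minimum density power divergence estimate of (mu, sigma)."""
    # Robust starting values so the optimizer is not dragged by outliers
    start = np.array([np.median(x), np.log(np.std(x))])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)
```

With contaminated data (e.g. a clean N(0, 1) sample plus a cluster of gross outliers), the MDPDE location estimate stays near 0 while the sample mean, the MLE, is pulled toward the outliers; alpha controls the robustness-efficiency trade-off, with alpha → 0 recovering the MLE.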
