Abstract

Confidence intervals for maximum likelihood estimates (MLEs) are widely used in statistical inference. To construct such intervals accurately, one typically needs to know the distribution of the MLE. Standard asymptotic theory states that the suitably normalized MLE is asymptotically normal with mean zero and variance determined by the Fisher information matrix (FIM) at the unknown true parameter. Two common estimates of this variance are based on the observed FIM (equivalently, the Hessian of the negative log-likelihood) and the expected FIM, both evaluated at the MLE given the sample data. We show that, under reasonable conditions, the expected FIM tends to outperform the observed FIM under a mean-squared error criterion. This result suggests that, under those conditions, the expected FIM is the better estimate of the variance of the MLE when used in confidence interval calculations.
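The contrast between the two variance estimates can be made concrete with a small numerical sketch. The Cauchy location model used here is an illustrative assumption, not an example from the paper; it is chosen because, unlike many exponential-family models, its observed and expected information generally differ at the MLE, so the two resulting confidence intervals are not identical. The data values, starting point, and iteration count are all hypothetical.

```python
import math

# Hypothetical illustration: Cauchy location model with density
# f(x; theta) = 1 / (pi * (1 + (x - theta)^2)).
# Observed FIM = -l''(theta_hat); expected FIM = n/2 (a known constant
# for this model). The two give different interval half-widths.

def score(theta, xs):
    # First derivative of the log-likelihood in theta
    return sum(2 * (x - theta) / (1 + (x - theta) ** 2) for x in xs)

def observed_info(theta, xs):
    # Observed FIM: negative second derivative of the log-likelihood
    return sum(2 * (1 - (x - theta) ** 2) / (1 + (x - theta) ** 2) ** 2
               for x in xs)

def expected_info(n):
    # Expected FIM for the Cauchy location model: E[-l''] = n/2
    return n / 2

def mle(xs, theta0, iters=50):
    # Newton iterations on the score; adequate when started near the MLE
    theta = theta0
    for _ in range(iters):
        theta += score(theta, xs) / observed_info(theta, xs)
    return theta

xs = [-1.2, 0.3, 0.8, 2.1, -0.5, 1.4, 0.1, -2.0]   # hypothetical sample
theta_hat = mle(xs, theta0=sorted(xs)[len(xs) // 2])  # start at the median
z = 1.96  # standard normal quantile for a 95% interval
for name, info in [("observed", observed_info(theta_hat, xs)),
                   ("expected", expected_info(len(xs)))]:
    half = z / math.sqrt(info)  # CI half-width from each variance estimate
    print(f"{name} FIM CI: [{theta_hat - half:.3f}, {theta_hat + half:.3f}]")
```

Both intervals are centered at the same MLE; only the estimated variance, and hence the half-width, changes depending on which information matrix is used.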
