Abstract

The practice of employing empirical likelihood (EL) components in place of parametric likelihood functions in the construction of Bayesian-type procedures is well established in the modern statistical literature. We rigorously derive the EL prior, a Jeffreys-type prior that asymptotically maximizes the Shannon mutual information between the data and the parameters of interest. The proposed approach centers on an integrated Kullback–Leibler distance between the EL-based posterior and prior density functions. The EL prior density is the density function for which the corresponding posterior is asymptotically negligibly different from the EL itself. The proposed result can be used to develop a methodology for reducing the asymptotic bias of solutions of general estimating equations and M-estimation schemes by removing the first-order bias term. This technique is developed in a manner similar to methods that reduce the asymptotic bias of maximum likelihood estimates by penalizing the underlying parametric likelihoods with their Jeffreys invariant priors. A real data example from a study of myocardial infarction illustrates the practical attractiveness of the proposed technique.
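
For orientation, the quantities named in the abstract can be sketched in their standard textbook forms; the displays below are illustrative reconstructions with our own notation, not formulas taken from the paper. The empirical likelihood for a parameter $\theta$ identified by a generic estimating function $g$ is usually written as

\[
\mathrm{EL}(\theta) \;=\; \sup\Big\{\textstyle\prod_{i=1}^{n} n p_i \;:\; p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1,\ \sum_{i=1}^{n} p_i\, g(X_i,\theta) = 0\Big\}.
\]

In the standard reference-prior formulation, the mutual-information criterion selects the prior $\pi$ maximizing the expected Kullback–Leibler divergence between posterior and prior,

\[
\mathcal{I}(\pi) \;=\; \mathbb{E}_{X}\!\left[\int \pi(\theta \mid X)\,\log\frac{\pi(\theta \mid X)}{\pi(\theta)}\,d\theta\right],
\]

where here the posterior $\pi(\theta \mid X)$ is formed with the EL in place of a parametric likelihood. The Jeffreys-prior penalization invoked in the bias-reduction analogy is, in its familiar parametric form,

\[
\ell^{*}(\theta) \;=\; \ell(\theta) + \tfrac{1}{2}\log\det I(\theta),
\]

with $\ell$ the log-likelihood and $I(\theta)$ the Fisher information; in canonical exponential-family models, maximizing $\ell^{*}$ coincides with Firth's modified-score estimator and removes the $O(n^{-1})$ term of the bias of the maximum likelihood estimate.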
