Abstract

This paper introduces a new information criterion for model selection, based on a predictive distribution that improves on the estimative one. The selection statistic is defined as a first-order estimator of the expected Kullback–Leibler information between the true model and the fitted one, obtained by means of the improved predictive procedure. The criterion turns out to be a simple, computationally undemanding alternative to the Takeuchi information criterion. Whenever the information identity holds, the Akaike information criterion is recovered as a particular case. The results are obtained in the case of independent, but not necessarily identically distributed, observations. Some applications, related to exponential families of distributions and regression models, are presented.
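To fix ideas, the standard Takeuchi criterion the abstract refers to penalizes the log-likelihood by tr(J I⁻¹), where J is the average outer product of per-observation scores and I is minus the average Hessian; when the information identity J = I holds, the trace equals the number of parameters p and the Akaike criterion is recovered. The sketch below is an illustration of that classical relationship, not the paper's new criterion, for a univariate normal model fitted by maximum likelihood (all function names are hypothetical):

```python
import math
import random

# Illustration (not the paper's proposal): TIC = -2*loglik + 2*tr(J I^{-1})
# for a univariate normal fitted by ML. Under correct specification the
# information identity J = I holds, so tr(J I^{-1}) ~ p = 2 and TIC ~ AIC.

def tic_and_aic(x):
    n = len(x)
    mu = sum(x) / n
    s2 = sum((xi - mu) ** 2 for xi in x) / n      # ML variance estimate
    loglik = -0.5 * n * (math.log(2 * math.pi * s2) + 1)

    # Accumulate J (avg outer product of scores) and I (minus avg Hessian)
    # for theta = (mu, sigma^2), using the analytic normal derivatives.
    J = [[0.0, 0.0], [0.0, 0.0]]
    I = [[0.0, 0.0], [0.0, 0.0]]
    for xi in x:
        d = xi - mu
        s = (d / s2, -0.5 / s2 + d * d / (2 * s2 ** 2))       # score
        h = ((-1.0 / s2, -d / s2 ** 2),                        # Hessian
             (-d / s2 ** 2, 0.5 / s2 ** 2 - d * d / s2 ** 3))
        for a in range(2):
            for b in range(2):
                J[a][b] += s[a] * s[b] / n
                I[a][b] -= h[a][b] / n

    # tr(J I^{-1}) via the explicit 2x2 inverse of I
    det = I[0][0] * I[1][1] - I[0][1] * I[1][0]
    penalty = (J[0][0] * I[1][1] - J[0][1] * I[1][0]
               - J[1][0] * I[0][1] + J[1][1] * I[0][0]) / det
    tic = -2 * loglik + 2 * penalty
    aic = -2 * loglik + 2 * 2                                  # p = 2
    return tic, aic, penalty

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]
tic, aic, penalty = tic_and_aic(data)
print(penalty)   # close to p = 2, since the model is correctly specified
```

Under misspecification (e.g. heavy-tailed data fitted by a normal), the trace penalty departs from p, which is precisely the situation where TIC and AIC differ.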

