Abstract

Bayesian predictive densities are investigated, within the framework of information geometry, for settings in which the observed data x and the target variable y to be predicted have different distributions. The performance of predictive densities is evaluated by the Kullback–Leibler divergence, and the parametric models are formulated as Riemannian manifolds. In the conventional setting, where x and y have the same distribution, the Fisher–Rao metric and the Jeffreys prior play essential roles. In the present setting, where x and y have different distributions, the corresponding roles are played by a new metric, which we call the predictive metric, constructed from the Fisher information matrices of x and y, and by the volume element based on this predictive metric. It is shown that Bayesian predictive densities based on priors constructed from non-constant positive superharmonic functions with respect to the predictive metric asymptotically dominate those based on the volume element prior of the predictive metric.
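For orientation, the basic objects referred to in the abstract can be sketched as follows; the notation is illustrative, and the exact construction of the predictive metric is given in the paper itself. Given a parametric model with densities p(x | θ) for the data and p(y | θ) for the target, θ ∈ Θ ⊆ R^d, and a prior π, the Bayesian predictive density is
\[
  \hat{p}_{\pi}(y \mid x) \;=\; \int_{\Theta} p(y \mid \theta)\, \pi(\theta \mid x)\, d\theta,
  \qquad
  \pi(\theta \mid x) \;\propto\; p(x \mid \theta)\, \pi(\theta),
\]
and its performance is evaluated by the Kullback–Leibler risk
\[
  R(\theta, \hat{p}_{\pi}) \;=\; \int p(x \mid \theta) \int p(y \mid \theta)\,
  \log \frac{p(y \mid \theta)}{\hat{p}_{\pi}(y \mid x)} \, dy \, dx .
\]
The two Fisher information matrices entering the construction are
\[
  g^{(x)}_{ij}(\theta) = \mathrm{E}_{x}\!\left[\partial_i \log p(x \mid \theta)\, \partial_j \log p(x \mid \theta)\right],
  \qquad
  g^{(y)}_{ij}(\theta) = \mathrm{E}_{y}\!\left[\partial_i \log p(y \mid \theta)\, \partial_j \log p(y \mid \theta)\right].
\]
When x and y have the same distribution these matrices coincide, and the Jeffreys prior \(\pi_{J}(\theta) \propto \sqrt{\det g(\theta)}\) is the volume element of the Fisher–Rao metric. In the present setting the predictive metric \(\tilde{g}\), built from \(g^{(x)}\) and \(g^{(y)}\), takes over that role, with volume element prior \(\pi_{\tilde{g}}(\theta) \propto \sqrt{\det \tilde{g}(\theta)}\).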
