Abstract

There are many settings in which Bayesian predictive distributions are more appropriate than plug-in distributions. When a Bayesian procedure is used, the choice of prior distribution is a serious problem, and non-informative or vague priors are often adopted to construct Bayesian predictive distributions. Many studies have examined the relation between priors and Bayes estimators; here we investigate the corresponding relation between priors and Bayesian predictive distributions. We adopt the Kullback-Leibler divergence from the true distribution to a predictive distribution as the loss function and consider invariant predictive distributions. When a model has a group structure, the right invariant measure is often recommended as a non-informative prior. We show that Bayesian predictive distributions based on right invariant measures are the best invariant predictive distributions. We also touch on shrinkage methods for constructing predictive distributions.
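For concreteness, the following is a minimal sketch of the standard setup the abstract refers to; the notation is ours and the paper's own conventions may differ. Given data $x$ from a model $p(x \mid \theta)$ and a prior $\pi(\theta)$, the Bayesian predictive distribution for a future observation $y$ averages the model over the posterior, whereas a plug-in distribution substitutes a point estimate $\hat\theta(x)$:
\[
\pi(\theta \mid x) \propto p(x \mid \theta)\,\pi(\theta),
\qquad
p_\pi(y \mid x) = \int p(y \mid \theta)\,\pi(\theta \mid x)\,d\theta,
\qquad
\hat p(y \mid x) = p\bigl(y \mid \hat\theta(x)\bigr).
\]
The loss of a predictive distribution $\hat p$ is then the Kullback-Leibler divergence from the true distribution,
\[
L(\theta, \hat p) = \int p(y \mid \theta)\,\log\frac{p(y \mid \theta)}{\hat p(y \mid x)}\,dy,
\]
and predictive distributions are compared through the risk $E_\theta\bigl[L(\theta, \hat p)\bigr]$, the expectation of this loss over $x$.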
