Abstract

Models are often defined through conditional rather than joint distributions, but it can be difficult to check whether the conditional distributions are compatible, i.e. whether there exists a joint probability distribution which generates them. When they are compatible, a Gibbs sampler can be used to sample from this joint distribution. When they are not, the Gibbs sampling algorithm may still be applied, resulting in a "pseudo-Gibbs sampler". We show that its stationary probability distribution is the optimal compromise between the conditional distributions, in the sense that it minimizes a mean squared misfit between them and its own conditional distributions. This allows us to perform Objective Bayesian analysis of correlation parameters in Kriging models by using univariate conditional Jeffreys-rule posterior distributions instead of the widely used multivariate Jeffreys-rule posterior. This strategy makes the full-Bayesian procedure tractable. Numerical examples show it has near-optimal frequentist performance in terms of prediction interval coverage.

Highlights

  • There are two ways to create statistical models for multiple random variables

  • We focus on Simple Kriging, where the Gaussian process is assumed to be stationary with known mean, as opposed to Universal Kriging, which incorporates an unknown mean function

  • It is interesting to compute the average of the coverages for all distributions, whether they are plug-in distributions based on the Maximum Likelihood Estimator (MLE) or the Maximum A Posteriori (MAP) estimator, or the predictive distribution based on the full posterior distribution



Introduction

There are two ways to create statistical models for multiple random variables. One can either consider them simultaneously and directly define their joint distribution, or one can define a system of conditional distributions. The main problem with the second approach is that the conditional distributions may not be compatible: that is, there may exist no joint distribution from which all of the conditional distributions can be derived. In the context of a model with a given prior distribution, Dawid and Lauritzen [11] examine the problem of eliciting a compatible prior distribution for a submodel. In the domain of Bayesian Networks, a probability distribution may or may not be compatible with a given Directed Acyclic Graph (DAG) [30].
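The pseudo-Gibbs idea can be illustrated with a minimal sketch (not code from the paper; the conditionals and parameter values below are hypothetical choices for illustration). We specify two Gaussian conditionals, X | Y = y ~ N(ay, 1) and Y | X = x ~ N(bx, 1); these are compatible with a joint bivariate normal only when a = b. For a ≠ b the alternating sampler is a pseudo-Gibbs sampler, yet the chain still settles into a stationary distribution, the "compromise" distribution discussed in the abstract.

```python
import numpy as np

def pseudo_gibbs(a, b, n_iter=50_000, seed=0):
    """Alternate draws from two specified Gaussian conditionals:
        X | Y = y  ~  N(a * y, 1)
        Y | X = x  ~  N(b * x, 1)
    Compatible (a joint bivariate normal exists) only when a == b;
    otherwise this is a pseudo-Gibbs sampler. Requires |a * b| < 1
    for the chain to be stable."""
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rng.normal(a * y, 1.0)  # draw from the conditional of X given Y
        y = rng.normal(b * x, 1.0)  # draw from the conditional of Y given X
        samples[t] = (x, y)
    return samples

# Incompatible conditionals (a != b): the chain still has a stationary law.
s = pseudo_gibbs(a=0.3, b=0.6)
burned = s[2_000:]  # discard burn-in
print(np.corrcoef(burned.T)[0, 1])  # empirical correlation under the stationary law
```

Note that the stationary distribution depends on the update order (X then Y versus Y then X) when the conditionals are incompatible, which is one reason a principled characterization of the compromise, as given in the paper, is needed.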

