Abstract

Wyner defined the common information of two discrete random variables as the minimum of I(W; X, Y) over all auxiliary random variables W that render X and Y conditionally independent. Its generalization to multiple dependent random variables revealed a surprising monotonicity property in the number of variables. Motivated by this property, this paper explores the application of Wyner's common information to inference problems and its connection with other performance metrics. A central question is under what conditions Wyner's common information captures the entire information that the observations contain about the inference object under a simple Bayesian model. For infinitely exchangeable random variables, it is shown using the de Finetti-Hewitt-Savage theorem that the common information is asymptotically equal to the information of the inference object. For finite exchangeable random variables, this conclusion no longer holds, even for infinitely extendable sequences. However, in some special cases, including both the binary and the Gaussian cases, a concrete connection between common information and inference performance metrics can be established even for finite samples.
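For reference, the definition sketched above can be written out in LaTeX as follows. The bivariate form restates the abstract's description; the multivariate form is the standard generalization and its notation (X_1, ..., X_n for the dependent variables) is an assumption, not taken from the text.

% Wyner's common information of two discrete random variables:
% minimize the mutual information I(W; X, Y) over auxiliary
% variables W that make X and Y conditionally independent.
\[
  C(X;Y) = \min_{W \,:\, X \perp Y \mid W} I(W; X, Y)
\]

% Assumed standard multivariate generalization: W must render
% X_1, ..., X_n mutually conditionally independent given W.
\[
  C(X_1, \dots, X_n) = \min_{W \,:\, X_1 \perp \dots \perp X_n \mid W} I(W; X_1, \dots, X_n)
\]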
