Abstract

We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques. To this end, we employ approximations that rest on varying levels of assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior- and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. The multivariate Gaussian posterior estimate involves the fewest assumptions and shows the best performance for estimating BME, information entropy and experiment utility from posterior-based sampling.
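The prior-based (Monte Carlo) route mentioned above reduces to a plain average of likelihood values over prior samples, since BME is the expectation of the likelihood under the prior. A minimal sketch follows; the toy model, the standard-normal prior and all numeric values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: y = theta^2, observed with Gaussian noise (assumed).
def likelihood(theta, y_obs, sigma=0.5):
    return np.exp(-0.5 * ((y_obs - theta**2) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Prior-based Monte Carlo BME estimate:
# p(y) = E_prior[p(y | theta)] ~= (1/N) * sum_i p(y | theta_i), theta_i ~ prior.
theta_prior = rng.normal(0.0, 1.0, size=100_000)  # standard normal prior (assumed)
y_obs = 1.2
bme = likelihood(theta_prior, y_obs).mean()
```

No assumption about the shape of the posterior enters here, which is why prior-based BME estimation is assumption-free in the sense used in the abstract; its cost is that many prior samples may fall in low-likelihood regions.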

Highlights

  • Probability theory and stochastic analysis provide powerful tools for model selection, parameter inference, data assimilation and experimental design

  • The current paper shows the link between Bayesian inference and information theory

  • We demonstrate how Bayesian model selection can profit from information theory to estimate Bayesian model evidence (BME)


Introduction

Probability theory and stochastic analysis provide powerful tools for model selection, parameter inference, data assimilation and experimental design. The link between Bayesian inference and information theory can be employed for model selection, assessment of information entropy and experimental design. The scope of the current paper is to align BME with entropies from information theory in order to simplify BME and relative entropy estimation using either prior- or posterior-based sampling techniques. We emphasize that the information entropy and the predicted utility of an experiment can be computed while avoiding unnecessary multidimensional integration, for both prior- and posterior-based sampling approaches. Multivariate Gaussian posterior estimates, similar to the Gelfand and Dey approach [21], involve the fewest assumptions among all approximations discussed in Section 3 and offer a suitable assessment of BME and information entropy using posterior-based approaches.
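The posterior-based route can be sketched as a Gelfand-and-Dey-style estimator: average an importance density divided by likelihood times prior over posterior samples, with the importance density taken as a Gaussian fitted to those samples (the "multivariate Gaussian posterior estimate"). The conjugate toy problem below, where the posterior can be sampled directly and the true BME is known in closed form, is an illustrative assumption for checking the estimator, not an example from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
y_obs, sigma = 1.2, 0.5

def log_norm_pdf(x, mu, var):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mu) ** 2 / var

# Conjugate toy problem (assumed): prior theta ~ N(0, 1), likelihood y | theta ~ N(theta, sigma^2),
# so the posterior is Gaussian with known moments and can be sampled directly (stand-in for MCMC).
post_var = 1.0 / (1.0 + 1.0 / sigma**2)
post_mean = post_var * y_obs / sigma**2
theta_post = rng.normal(post_mean, np.sqrt(post_var), size=100_000)

# Gelfand-Dey estimator: 1/BME ~= mean over posterior samples of
# g(theta) / (p(y | theta) * p(theta)), with g a Gaussian fitted to the posterior sample.
log_g = log_norm_pdf(theta_post, theta_post.mean(), theta_post.var())
log_lik = log_norm_pdf(y_obs, theta_post, sigma**2)
log_prior = log_norm_pdf(theta_post, 0.0, 1.0)
bme_gd = 1.0 / np.mean(np.exp(log_g - log_lik - log_prior))
```

Because the fitted Gaussian here closely matches the true posterior, the ratio inside the mean is nearly constant and the estimator has low variance; with a poorly matched importance density the same formula remains unbiased for 1/BME but can become unstable.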

Bayesian Inference
Information Theory
From Bayesian Inference to Information Theory
Bayesian Model Selection
Model Evidence via Posterior Density Estimates
Model Evidence via Dirac at the Maximum a Posteriori Estimate
Model Evidence via the Chib Estimate
Model Evidence via the Akaike Information Criterion
Model Evidence via Multivariate Gaussian Posterior Estimates
Model Evidence via the Kashyap Information Criterion Correction
Model Evidence via the Schwarz Information Criterion Correction
Model Evidence via the Gelfand and Dey Estimate
Bayesian View on the Information Gain
Information Entropy during Bayesian Inference
Bayesian Experimental Design and Information Gain
Scenario Set Up
Information Entropy and Bayesian Experimental Design
Summary and Conclusions