Abstract

What is a good prior? Actual prior knowledge should be used, but for complex models it is often not easily available. The knowledge can take the form of symmetry assumptions, and then the choice will typically be an improper prior. More generally, it is quite common to choose improper priors. Motivated by this, we consider a theoretical framework for statistics that includes both improper priors and improper posteriors. Knowledge is then represented by a possibly unbounded measure, with the interpretation explained by Rényi in 1955. The main mathematical result here is a constructive proof of existence of a transformation from prior to posterior knowledge. The posterior always exists and is uniquely defined by the prior, the observed data, and the statistical model. The transformation is, as it should be, an extension of conventional Bayesian inference as defined by the axioms of Kolmogorov. It is an extension since the novel construction remains valid when the axioms of Kolmogorov are replaced by the axioms of Rényi for a conditional probability space. A concrete case based on Markov Chain Monte Carlo simulations and data for different species of tropical butterflies illustrates that an improper posterior may appear naturally and is useful. The theory is also exemplified by more elementary examples.
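As a minimal numerical sketch of the standard situation the abstract extends (not taken from the paper itself): with the improper flat prior π(θ) = 1 on the real line and a single observation x ~ N(θ, σ²), the unnormalized posterior is the likelihood, and integrating it shows the posterior is nonetheless a proper density. All names and values below are illustrative assumptions.

```python
import math

def likelihood(theta, x, sigma):
    # N(theta, sigma^2) density evaluated at x; with a flat improper
    # prior pi(theta) = 1, this is also the unnormalized posterior in theta.
    return math.exp(-(x - theta) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

x_obs, sigma = 1.7, 0.5          # illustrative data point and known scale
step = 0.001
grid = [x_obs - 10 + i * step for i in range(20001)]   # wide grid around x_obs
vals = [likelihood(t, x_obs, sigma) for t in grid]

# Trapezoidal approximation of the posterior's normalizing constant.
Z = sum((a + b) / 2 * step for a, b in zip(vals, vals[1:]))
print(round(Z, 4))  # close to 1.0: the posterior is proper, a N(x_obs, sigma^2) density
```

The point of the paper's framework is that the construction goes through even when such a normalizing constant is infinite, i.e. when the posterior itself is improper.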
