Abstract

We resolve a useful formulation of the question of how a statistician can coherently incorporate the information in a consulted expert’s probability assessment for an event into a personal posterior probability assertion. Using a framework that recognises the total information available as composed of units available only to each of them along with units available to both, we show: that a sufficient statistic for all the information available to both the expert and the statistician is the product of their odds ratios in favour of the event; that the geometric mean of their two probabilities specifies a contour of pairs of assertions in the unit square that yield the same posterior probability; that the information-combining function is parameterised by an unknown probability for the event conditioned only on the unspecified information common to both the statistician and the expert; and that an assessable mixing distribution over this unspecified probability yields an integrable mixture distribution that represents a computable posterior probability. The exact results allow the identification of the subclass of coherent combining functions that are externally Bayesian operators. This subclass is equivalent to the class of combining functions that honour the principles of uniformity and compromise.
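
To make the structure described above concrete, here is a minimal numerical sketch. It assumes, as one plausible reading of the abstract, that the combined odds take the form O_post = O_stat × O_expert / O_0, where O_0 is the odds implied by the unknown probability π0 conditioned only on the information common to both parties, and it uses a Beta(2, 2) mixing distribution for π0 purely as a placeholder; the function names, the exact combining rule, and the mixing choice are illustrative assumptions, not the paper's specification.

```python
from scipy import integrate, stats

def odds(p):
    """Odds in favour of the event implied by probability p."""
    return p / (1.0 - p)

def combine(p_stat, p_exp, pi0):
    """Hypothetical combining rule consistent with the abstract's structure:
    multiply the two asserted odds and divide out the odds implied by the
    unknown probability pi0 based only on the shared information."""
    o = odds(p_stat) * odds(p_exp) / odds(pi0)
    return o / (1.0 + o)

def mixed_posterior(p_stat, p_exp, mixing=stats.beta(2, 2)):
    """Average the combining function over an assessed mixing distribution
    for pi0 (a Beta(2, 2) placeholder here), giving a computable posterior."""
    integrand = lambda pi0: combine(p_stat, p_exp, pi0) * mixing.pdf(pi0)
    value, _ = integrate.quad(integrand, 1e-6, 1.0 - 1e-6)
    return value

# Example: the statistician asserts 0.6 and the expert asserts 0.8.
print(mixed_posterior(0.6, 0.8))
```

Within this sketch, the two assertions enter the posterior only through the product of their odds, which mirrors the sufficiency claim in the abstract; the mixing step illustrates how an assessed distribution for the unspecified common-information probability makes the posterior computable.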
