Abstract

In objective Bayesian model selection, a well-known problem is that standard non-informative prior distributions are improper and therefore do not yield a well-defined Bayes factor. Using a small part of the data, i.e., a training sample, to construct a proper posterior prior distribution has become a popular way to resolve this issue and appears to produce reasonable default Bayes factors, such as the intrinsic Bayes factor or a Bayes factor based on the empirical expected-posterior prior. This paper illustrates that such default methods may nevertheless give unreasonable outcomes when evaluating inequality constrained models that are supported by the data. To resolve this issue, a default method is proposed for constructing so-called constrained posterior priors, inspired by the symmetrical intrinsic priors discussed by Berger and Mortera (1999) for a simple inequality constrained model selection problem. The resulting Bayes factors can be called “balanced” because the model complexity of inequality constrained models is incorporated according to a specific definition presented in this paper.
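
To make the prior-indeterminacy problem mentioned above concrete, the following sketch uses generic notation (models $M_1$ and $M_2$ with likelihoods $f_1$, $f_2$ and non-informative priors $\pi_1^N$, $\pi_2^N$); none of these symbols are taken from the paper. It also shows one common way of quantifying the complexity of an inequality constrained model under an encompassing prior, which is in the spirit of, though not necessarily identical to, the definition referred to in the abstract.

$$
BF_{12} \;=\; \frac{m_1(y)}{m_2(y)},
\qquad
m_t(y) \;=\; \int f_t(y \mid \theta_t)\, \pi_t^N(\theta_t)\, d\theta_t .
$$

If the non-informative priors are improper, $\pi_t^N(\theta_t) = c_t\, h_t(\theta_t)$ with unspecified constants $c_t > 0$, then

$$
BF_{12} \;=\; \frac{c_1}{c_2} \cdot
\frac{\int f_1(y \mid \theta_1)\, h_1(\theta_1)\, d\theta_1}
     {\int f_2(y \mid \theta_2)\, h_2(\theta_2)\, d\theta_2},
$$

so the Bayes factor inherits the arbitrary ratio $c_1/c_2$ and is not well defined. For an inequality constrained model $M_i : \theta \in \Theta_i$ (e.g., $\theta_1 > \theta_2$) nested in an encompassing model $M_u$ with proper prior $\pi_u$, a common complexity measure is the prior mass that $\pi_u$ assigns to the constrained region,

$$
c_i \;=\; \Pr(\theta \in \Theta_i \mid M_u) \;=\; \int_{\Theta_i} \pi_u(\theta)\, d\theta ,
$$

which equals $1/2$ for a single inequality constraint on two exchangeable parameters.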
