Abstract
We study several theoretical properties of Jeffreys’s prior for binomial regression models. We show that Jeffreys’s prior is symmetric and unimodal for a class of binomial regression models. We characterize the tail behavior of Jeffreys’s prior by comparing it with the multivariate t and normal distributions under the commonly used logistic, probit, and complementary log–log regression models. We also show that the prior and posterior normalizing constants under Jeffreys’s prior are invariant under linear transformations of the covariates. We further establish a theoretical connection between the Bayes information criterion and the dimension penalty induced by Jeffreys’s prior for binomial regression models with general links in variable selection problems. Moreover, we develop an importance sampling algorithm for carrying out prior and posterior computations under Jeffreys’s prior. We analyze a real data set to illustrate the proposed methodology.
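As a rough illustration of the kind of computation the abstract describes, the sketch below estimates a posterior mean under Jeffreys's prior for a logistic regression by importance sampling with a multivariate-t proposal. This is a generic textbook-style sketch, not the paper's algorithm: the simulated data, the proposal centred at the true coefficients, and the identity proposal scale are all assumptions made for the example.

```python
import numpy as np
from math import lgamma, log, pi

rng = np.random.default_rng(0)

# Hypothetical simulated logistic-regression data (illustrative only)
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([0.5, -1.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

def log_likelihood(beta):
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))

def log_jeffreys(beta):
    # Jeffreys's prior: proportional to sqrt(det Fisher information),
    # which for the logit link is |X' W X|^(1/2), W = diag(mu * (1 - mu))
    eta = X @ beta
    w = np.exp(eta) / (1.0 + np.exp(eta)) ** 2
    _, logdet = np.linalg.slogdet(X.T @ (w[:, None] * X))
    return 0.5 * logdet

# Multivariate-t proposal (df degrees of freedom, identity scale),
# centred here at beta_true purely for simplicity of the sketch
df = 5

def sample_t(m):
    g = rng.chisquare(df, size=m) / df
    z = rng.normal(size=(m, p))
    return beta_true + z / np.sqrt(g)[:, None]

def log_t_pdf(b):
    d = b - beta_true
    q = np.sum(d * d, axis=1)
    c = lgamma((df + p) / 2) - lgamma(df / 2) - (p / 2) * log(df * pi)
    return c - (df + p) / 2 * np.log1p(q / df)

# Importance sampling: weight = (likelihood * Jeffreys prior) / proposal
m = 5000
draws = sample_t(m)
logw = np.array([log_likelihood(b) + log_jeffreys(b) for b in draws]) - log_t_pdf(draws)
w = np.exp(logw - logw.max())          # stabilize before exponentiating
post_mean = (w[:, None] * draws).sum(axis=0) / w.sum()
```

The heavy-tailed t proposal is the natural choice here given the paper's tail comparison of Jeffreys's prior with multivariate t distributions; a normal proposal with lighter tails than the target can produce unstable importance weights.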