Abstract

This paper studies Bayesian variable selection in linear models with general spherically symmetric error distributions. We construct the posterior odds based on a separable prior, which arises as a class of mixtures of Gaussian densities. The posterior odds for comparing non-null models are shown to be independent of the error distribution, provided it is spherically symmetric. Because of this invariance, we refer to our method as a robust Bayesian variable selection method. We demonstrate that our posterior odds possess model selection consistency and that, within a large class of priors, ours is the only one that is robust in this sense.
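As an illustrative sketch only (the notation below is ours and may differ from the paper's exact formulation), a spherically symmetric error model for a candidate model indexed by $\gamma$, together with a scale-mixture-of-Gaussians prior on the regression coefficients, can be written as
\[
  y = X_\gamma \beta_\gamma + \varepsilon,
  \qquad
  p(\varepsilon \mid \sigma^2) = \sigma^{-n}\, f\!\left(\frac{\|\varepsilon\|^{2}}{\sigma^{2}}\right),
\]
\[
  \pi(\beta_\gamma \mid \sigma^2)
  = \int_{0}^{\infty}
      \mathcal{N}\!\bigl(\beta_\gamma \mid 0,\; g\,\sigma^{2} \Sigma_\gamma \bigr)\,
      \pi(g)\, dg,
\]
where $f$ is an arbitrary spherically symmetric density generator, $\Sigma_\gamma$ is a hypothetical prior scale matrix, and the mixing density $\pi(g)$ induces the mixture-of-Gaussians structure. Under a construction of this kind, the abstract's robustness claim is that the posterior odds between two non-null models do not depend on the choice of $f$.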
