Abstract

We consider the problem of variable selection in linear regression using mixtures of g-priors. A number of mixtures proposed in the literature work well, especially when the number of regressors p is fixed. In this paper, we propose a mixture of g-priors suitable for the case when p grows with the sample size n, more specifically when $p = O(n^b)$, $0 < b < 1$. The marginal density based on the proposed mixture admits a closed-form approximation, which makes applying the method as tractable as an information criterion-based method. The proposed method satisfies fundamental properties such as model selection consistency when the true model lies in the model space, as well as consistency in an appropriate sense under a misspecified-model setup. The method is robust in that these properties are not confined to normal linear models; they continue to hold, under reasonable conditions, for a general class of error distributions. Finally, we compare the performance of the proposed prior theoretically with that of some other mixtures of g-priors, and we also compare it with several other Bayesian model selection methods on simulated data sets. Both theoretically and in simulations, it emerges that, unlike most other model selection methods, the proposed prior reliably selects the true model irrespective of its dimension.

