Abstract

Model selection is a ubiquitous task in the model-building process that seeks a parsimonious model to explain patterns in data. The Bayesian information criterion (BIC) is one such tool, used in applications for its consistency, where this term is traditionally synonymous with asymptotic consistency. This article introduces the concept of exact consistency in model selection, which places an upper bound on the probability of selecting under-fitted models, to elucidate finite-sample properties of asymptotically consistent criteria. A new class of criteria, gBIC, is proposed, with conditions on candidate-model priors that guarantee both exact and asymptotic consistency. gBIC provides a theoretical means to assess the consistency of existing BIC variants by simply checking functional properties of their respective penalty components. It is shown that gBIC includes the sample-size-adjusted BIC (SABIC) as a special case and improves on BIC via the proposed model-modified BIC (mBIC). Compared with BIC, mBIC guarantees higher success rates for selecting models that match or contain the data-generating model at any sample size in linear regression and time series settings. Lastly, mBIC can easily be adopted in linear regression and time series via simple algebraic adjustments to BIC/SABIC values.
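For context, the classical BIC that the proposed variants adjust takes the familiar form n·log(RSS/n) + k·log(n) for a Gaussian linear model with k parameters fit to n observations, and model selection proceeds by minimizing this score over candidate models. The sketch below illustrates only this baseline BIC-based selection among nested regression models; the paper's mBIC/SABIC adjustments are not reproduced here, and the function name and simulated data are illustrative assumptions:

```python
import numpy as np

def bic_linear(y, X):
    """Classical BIC for a Gaussian linear model: n*log(RSS/n) + k*log(n).

    Note: this is the standard baseline criterion, not the paper's
    proposed mBIC/SABIC adjustments.
    """
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit
    rss = np.sum((y - X @ beta) ** 2)             # residual sum of squares
    return n * np.log(rss / n) + k * np.log(n)

# Simulated example: true model uses an intercept plus 2 of 3 predictors.
rng = np.random.default_rng(0)
n = 200
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X_full[:, :3] @ np.array([1.0, 2.0, -1.5]) + rng.normal(size=n)

# Score nested candidate models (first k columns) and pick the minimizer.
scores = {k: bic_linear(y, X_full[:, :k]) for k in range(1, 5)}
best = min(scores, key=scores.get)
```

With strong signal coefficients as above, the minimum-BIC model typically coincides with the true three-parameter model, illustrating the consistency property the abstract refers to.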
