Abstract
The Kullback information criterion (KIC) (Cavanaugh, J.E., Statistics and Probability Letters, vol. 42, p. 333-43, 1999) and its bias-corrected version, KICc (Seghouane, A.-K. et al., Proc. ICASSP, p. 145-8, 2003), are two methods for statistical model selection of regression variables and autoregressive models. Both criteria may be viewed as estimators of the Kullback symmetric divergence between the true model and the fitted approximating model. The bias of KIC and KICc is studied in the underfitting case, where none of the candidate models includes the true model. Here, only normal linear regression models are considered, and an exact expression of the bias is obtained for both KIC and KICc. The bias of KICc is often smaller, in most cases drastically smaller, than that of KIC. A simulation study, in which the true model has an infinite-order polynomial expansion, shows that, for small and moderate sample sizes, KICc provides better model selection than KIC. Furthermore, KICc outperforms the two well-known criteria AIC and MDL.
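The abstract compares KIC and KICc against AIC and MDL for choosing among underfitting polynomial regression models. The sketch below is a minimal, illustrative Python version of that kind of comparison, not the paper's actual experiment: the true curve, noise level, and candidate degrees are invented for illustration, the KIC penalty of 3 per estimated parameter follows my reading of Cavanaugh (1999) and should be checked against the paper, and the exact small-sample correction defining KICc is given in Seghouane et al. (2003) and is not reproduced here.

```python
import numpy as np

def fit_polynomial(x, y, degree):
    """OLS fit of a polynomial of the given degree; returns RSS and number of coefficients."""
    X = np.vander(x, degree + 1, increasing=True)   # design matrix [1, x, x^2, ...]
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    if rss.size == 0:                               # lstsq returns an empty RSS if X is rank-deficient
        rss = np.array([np.sum((y - X @ beta) ** 2)])
    return float(rss[0]), degree + 1                # regression coefficients only; variance counted below

def criteria(rss, n, k):
    """Model selection scores (smaller is better) for a Gaussian linear model.

    k is the number of regression coefficients; the error variance adds one more
    estimated parameter. AIC and MDL (BIC) use their standard Gaussian forms.
    The KIC penalty of 3 per parameter is an assumption based on Cavanaugh (1999);
    the small-sample corrected KICc penalty of Seghouane et al. (2003) is omitted.
    """
    sigma2 = rss / n                     # ML estimate of the noise variance
    loglik_term = n * np.log(sigma2)     # -2 * log-likelihood, up to an additive constant
    return {
        "AIC": loglik_term + 2 * (k + 1),
        "MDL": loglik_term + (k + 1) * np.log(n),   # a.k.a. BIC
        "KIC": loglik_term + 3 * (k + 1),           # assumed penalty, see docstring
    }

# Toy analogue of the paper's setup: the true curve has an infinite-order polynomial
# expansion, so every finite-degree candidate underfits.
rng = np.random.default_rng(0)
n = 30
x = np.linspace(0.0, 1.0, n)
y = np.exp(np.sin(2 * np.pi * x)) + rng.normal(scale=0.3, size=n)

best = {}
for degree in range(10):                 # candidate polynomial orders
    rss, k = fit_polynomial(x, y, degree)
    for name, score in criteria(rss, n, k).items():
        if name not in best or score < best[name][0]:
            best[name] = (score, degree)

for name, (score, degree) in best.items():
    print(f"{name}: selects degree {degree} (score {score:.2f})")
```

In this hypothetical setup, the criteria differ only in their penalty terms, which is why a heavier penalty (such as KIC's) tends to pick lower-order models at small sample sizes; the paper's point is that the bias-corrected KICc further improves this behaviour.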