Abstract

The finite sample information criterion (FSIC) is introduced as an estimator for the Kullback-Leibler discrepancy of an autoregressive time series. It is derived specifically for order selection in finite samples, where model orders are greater than one tenth of the sample size. It uses a theoretical expression for the ratio between the squared prediction error and the residual variance as the penalty factor for additional parameters in a model. This ratio can be found with the finite sample theory for autoregressive estimation, which is based on empirical approximations for the variance of estimated parameters. It takes into account the different number of degrees of freedom that are effectively available in the various algorithms for autoregressive parameter estimation. The performance of FSIC has been compared with existing order selection criteria in simulation experiments using four different estimation methods. In finite samples, FSIC selects model orders with a better objective quality for all estimation methods.
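To make the idea concrete, the following is a minimal sketch of a finite-sample order selection criterion of the kind described above. The exact formula for FSIC and the variance coefficients `v(i)` are not given in this abstract, so the expressions below (e.g. `v(i) = 1/(N+1-i)` for the Burg method, `v(i) = (N-i)/(N(N+2))` for Yule-Walker, and a penalty built from the product of `(1+v)/(1-v)` ratios) are assumptions based on commonly cited finite-sample approximations, not the authors' definitive implementation.

```python
import numpy as np

def variance_coefficients(N, max_order, method="burg"):
    """Empirical finite-sample variance coefficients v(i), i = 0..max_order.

    ASSUMPTION: these closed forms are common approximations from the
    finite sample theory for AR estimation; the paper may use others,
    and it reports results for four estimation methods in total.
    """
    i = np.arange(max_order + 1)
    if method == "burg":
        return 1.0 / (N + 1 - i)
    if method == "yule-walker":
        return (N - i) / (N * (N + 2.0))
    raise ValueError(f"no variance coefficients defined here for {method!r}")

def fsic(res_var, N, method="burg"):
    """Finite sample information criterion for orders p = 0..P.

    ASSUMPTION: the penalty is taken as the cumulative product of the
    ratios (1 + v(i)) / (1 - v(i)) minus one, i.e. the theoretical
    ratio between squared prediction error and residual variance,
    added to the log of the residual variance res_var[p].
    """
    P = len(res_var) - 1
    v = variance_coefficients(N, P, method)
    penalty = np.cumprod((1.0 + v) / (1.0 - v)) - 1.0
    return np.log(np.asarray(res_var, dtype=float)) + penalty

# Hypothetical usage: res_var[p] is the residual variance of the
# fitted AR(p) model for p = 0..3, from a sample of N = 50 points.
res_var = [1.00, 0.52, 0.45, 0.44]
criterion = fsic(res_var, N=50)
selected_order = int(np.argmin(criterion))
```

Because each `v(i)` grows as the order approaches the sample size, the product-based penalty rises much faster than the linear `2p/N` penalty of AIC, which is what discourages overfitting when the candidate order exceeds roughly one tenth of `N`.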
