Abstract

A crucial step in most signal processing problems is the selection of the order of a candidate model. Among the existing criteria, the two most popular in the signal processing literature have been Akaike's information criterion (AIC) and the Bayesian information criterion (BIC). These criteria are similar in form, each consisting of a data term and a penalty term. Different approaches have been used to derive them; however, none takes into account prior information concerning the parameters of the model. In this paper, a new approach to model selection that incorporates prior information on the model parameters is proposed. Using the proposed approach, and depending on the nature of the prior on the model parameters, two new information criteria are derived for univariate linear regression model selection. We use the term "information criteria" because their derivation is based on the Kullback-Leibler divergence.
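To make the shared form of AIC and BIC concrete (a common data term plus a criterion-specific penalty), here is a minimal sketch for the Gaussian linear regression case. This is an illustrative implementation, not the paper's proposed criteria; the function name, the toy polynomial-order example, and the simulated data are all assumptions.

```python
import numpy as np

def aic_bic(y, y_hat, k):
    """AIC and BIC for a Gaussian linear model with k free parameters.

    Both criteria share the data term -2 ln L(theta_hat), which for
    Gaussian errors reduces (up to an additive constant) to n*ln(RSS/n).
    They differ only in the penalty: AIC adds 2k, BIC adds k*ln(n).
    """
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    data_term = n * np.log(rss / n)
    return data_term + 2 * k, data_term + k * np.log(n)

# Toy order-selection example (assumed for illustration): fit
# polynomials of increasing order to data whose true order is 2,
# and pick the order minimizing each criterion.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 - 2.0 * x + 3.0 * x**2 + 0.3 * rng.standard_normal(x.size)

scores = {}
for order in range(1, 7):
    coeffs = np.polyfit(x, y, order)
    y_hat = np.polyval(coeffs, x)
    scores[order] = aic_bic(y, y_hat, k=order + 1)  # +1 for intercept

best_aic = min(scores, key=lambda o: scores[o][0])
best_bic = min(scores, key=lambda o: scores[o][1])
```

Note the design point the abstract alludes to: neither penalty above depends on a prior over the parameters, which is the gap the proposed criteria address.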

