Abstract
In this paper we propose an algorithm for selecting regressors (features, basis functions) in linear regression problems. To this end, we use a continuous generalization of the well-known Akaike information criterion (AIC). We develop a method for optimizing AIC with respect to individual regularization coefficients, where each coefficient determines the relevance of the corresponding regressor. We provide experimental results demonstrating that the proposed approach can be considered a non-Bayesian analog of the automatic relevance determination (ARD) approach and of the marginal likelihood optimization used in Relevance Vector Regression (RVR). The key difference of the new approach is its ability to find zero regularization coefficients; we hope this helps avoid the type-II overfitting (underfitting) reported for RVR. We also show that in a certain special case the two methods become identical.
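As a rough illustration of the idea summarized above, here is a minimal sketch (not the authors' algorithm) of tuning per-regressor regularization coefficients by minimizing a continuous AIC. The specific AIC form `n * log(RSS/n) + 2 * df`, the use of the hat-matrix trace as the effective number of parameters, and the choice of an L-BFGS-B optimizer are all assumptions for illustration.

```python
# Illustrative sketch only: individual regularization coefficients alpha_j
# are optimized against a continuous AIC, where the effective number of
# parameters is the trace of the smoothing ("hat") matrix. This is an
# assumed construction, not the paper's exact method.
import numpy as np
from scipy.optimize import minimize

def continuous_aic(alpha, X, y):
    """AIC with effective degrees of freedom trace(H(alpha))."""
    n, _ = X.shape
    A = np.diag(alpha)                          # one penalty per regressor
    G = X.T @ X + A
    w = np.linalg.solve(G, X.T @ y)             # regularized least-squares weights
    resid = y - X @ w
    rss = resid @ resid
    df = np.trace(X @ np.linalg.solve(G, X.T))  # effective number of parameters
    return n * np.log(rss / n) + 2.0 * df

def select_regressors(X, y, alpha0=1.0):
    d = X.shape[1]
    res = minimize(continuous_aic, x0=np.full(d, alpha0), args=(X, y),
                   method="L-BFGS-B",
                   bounds=[(0.0, None)] * d)    # zero coefficients are allowed
    # Small alpha_j -> regressor kept; very large alpha_j -> regressor pruned.
    return res.x

# Toy usage: only the first two columns actually generate y.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=100)
print(select_regressors(X, y))
```

In ARD/RVR terms, a coefficient driven to a very large value effectively removes its regressor, while the possibility of exactly zero coefficients is the distinction the abstract highlights.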