Abstract

Kernel ridge regression (KRR) is a nonlinear extension of ridge regression. The performance of KRR depends on its hyperparameters, such as the penalty factor C and the RBF kernel parameter sigma. We employ a method called MCV-KRR, which optimizes the KRR hyperparameters so that a cross-validation error is minimized; this method is equivalent to a predictive approach to Gaussian processes. Since the cost of KRR training is O(N³), where N is the data size, sparse approximations of KRR have recently been studied to reduce this complexity. In this paper, we apply the minimum cross-validation (MCV) approach to such sparse approximations. Our experiments show that MCV with a sparse approximation of KRR can achieve almost the same generalization performance as MCV-KRR at much lower cost.
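To make the setting concrete, below is a minimal sketch of KRR hyperparameter selection by minimizing a k-fold cross-validation error. It is not the paper's method: a simple grid search stands in for the MCV optimization, the toy data and grid values are illustrative assumptions, and the identification of the penalty factor C with the inverse regularization strength (so the dual system is K + I/C) is also an assumption about the paper's parameterization.

```python
# Hedged sketch: KRR with an RBF kernel, hyperparameters chosen by
# minimizing k-fold cross-validation error. Grid search is a simple
# stand-in for the MCV optimization described in the paper; data,
# grid values, and the C = 1/lambda convention are assumptions.
import numpy as np

def rbf_kernel(A, B, sigma):
    """RBF kernel matrix: K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def krr_fit(X, y, C, sigma):
    """Solve the dual KRR system (K + I/C) alpha = y (costs O(N^3))."""
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + np.eye(len(X)) / C, y)

def krr_predict(X_train, alpha, X_test, sigma):
    return rbf_kernel(X_test, X_train, sigma) @ alpha

def cv_error(X, y, C, sigma, n_folds=5):
    """Mean squared k-fold cross-validation error for one (C, sigma) pair."""
    folds = np.array_split(np.arange(len(X)), n_folds)
    errs = []
    for held_out in folds:
        train = np.setdiff1d(np.arange(len(X)), held_out)
        alpha = krr_fit(X[train], y[train], C, sigma)
        pred = krr_predict(X[train], alpha, X[held_out], sigma)
        errs.append(np.mean((pred - y[held_out]) ** 2))
    return np.mean(errs)

# Toy 1-D regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

# Pick the (C, sigma) pair that minimizes the CV error.
grid = [(C, s) for C in (1.0, 10.0, 100.0) for s in (0.3, 1.0, 3.0)]
best = min(grid, key=lambda p: cv_error(X, y, *p))
print("selected (C, sigma):", best)
```

The O(N³) solve inside each fold is exactly the cost the paper's sparse approximation targets: replacing the full kernel matrix with a low-rank surrogate makes each CV evaluation, and hence the whole MCV search, much cheaper.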
