Abstract

Ensemble regression methods typically outperform a single regressor because they combine several base regressors to improve accuracy and stability. In this paper, we propose a novel kernel ensemble regression method that minimizes a total least squares loss over multiple Reproducing Kernel Hilbert Spaces (RKHSs). Base kernel regressors are jointly optimized and weighted to form the ensemble regressor, so the problem of selecting suitable kernel types and parameters for each base regressor is resolved within the ensemble regression framework. Experimental results on several datasets, including artificial datasets and UCI regression and classification datasets, show that the proposed approach achieves the lowest regression loss among comparative methods such as ridge regression, support vector regression (SVR), gradient boosting, decision tree regression, and random forest.
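
As a rough illustration of the general idea of weighting base kernel regressors from different RKHSs, the Python sketch below fits several kernel ridge regressors with different kernels and parameters and combines their predictions with weights proportional to inverse validation error. This is a simplifying assumption for illustration only: the paper's method co-optimizes the base regressors under a total least squares loss, and the function names, weighting rule, and kernel configurations here are hypothetical.

```python
# Minimal sketch (assumed, not the authors' algorithm): an ensemble of kernel
# ridge regressors over several candidate kernels, each weighted by inverse
# validation MSE instead of the paper's co-optimized total least squares loss.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

def fit_kernel_ensemble(X, y, kernel_configs, alpha=1.0, seed=0):
    """Fit one KernelRidge model per kernel config; weight each by 1/validation MSE."""
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=seed)
    models, inv_errors = [], []
    for cfg in kernel_configs:
        model = KernelRidge(alpha=alpha, **cfg)
        model.fit(X_tr, y_tr)
        mse = np.mean((model.predict(X_val) - y_val) ** 2)
        models.append(model)
        inv_errors.append(1.0 / (mse + 1e-12))
    weights = np.array(inv_errors) / np.sum(inv_errors)  # normalize weights to sum to 1
    return models, weights

def predict_kernel_ensemble(models, weights, X):
    """Return the weighted combination of the base kernel regressors' predictions."""
    preds = np.stack([m.predict(X) for m in models], axis=0)
    return weights @ preds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    # Candidate base regressors: RBF kernels at several bandwidths plus a cubic kernel.
    configs = [{"kernel": "rbf", "gamma": g} for g in (0.1, 1.0, 10.0)]
    configs.append({"kernel": "polynomial", "degree": 3})
    models, weights = fit_kernel_ensemble(X, y, configs)
    print("ensemble weights:", np.round(weights, 3))
    print("prediction at x=0.5:", predict_kernel_ensemble(models, weights, np.array([[0.5]])))
```

In this sketch the ensemble weights simply favor the base kernels that generalize best on held-out data; the paper's contribution is to learn the base regressors and their combination jointly, which removes the need to hand-pick a single kernel type and parameter setting.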
