Abstract
One of the main objectives of machine learning techniques is to reduce the generalization prediction error. A central challenge in support vector models is the choice of an appropriate kernel function, together with the estimation of its hyperparameters; these procedures are usually carried out through testing and tuning processes that demand substantial computation. Ensemble methods, in contrast, offer an effective way to combine several models and achieve greater predictive capacity. In this paper, we propose a new ensemble method for support vector regression, named regression random machines. The proposed method eliminates the need to choose a single best kernel function during the tuning process by using a random mixture of kernel functions combined with a bagging ensemble that accounts for the strength and agreement of the single models. The results demonstrate good predictive performance, with lower generalization error than the single and bagged versions of support vector models with different kernels. The usefulness of the proposed method is illustrated by simulation studies over eight artificial scenarios and twenty-seven real-world applications.
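To make the idea concrete, the following is a minimal sketch of the "random kernel + weighted bagging" scheme described above. It is an illustration, not the authors' implementation: kernel ridge regression stands in for support vector regression to keep the sketch dependency-free, the candidate kernels and their hyperparameters are arbitrary choices, and the inverse out-of-bag RMSE weighting is a simple stand-in for the strength-and-agreement weights of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate kernel functions (illustrative choices and hyperparameters).
def rbf(X, Z, gamma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def linear(X, Z):
    return X @ Z.T

def poly(X, Z, degree=2):
    return (X @ Z.T + 1.0) ** degree

KERNELS = [rbf, linear, poly]

def fit_krr(K, y, lam=1e-2):
    # Kernel ridge regression stand-in for SVR: alpha = (K + lam I)^-1 y.
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

def random_machines_fit(X, y, B=20, lam=1e-2):
    """Fit B base learners, each on a bootstrap sample with a random kernel."""
    models, n = [], len(X)
    for _ in range(B):
        idx = rng.integers(0, n, n)                  # bootstrap sample
        oob = np.setdiff1d(np.arange(n), idx)        # out-of-bag indices
        kern = KERNELS[rng.integers(len(KERNELS))]   # random kernel choice
        Xb, yb = X[idx], y[idx]
        alpha = fit_krr(kern(Xb, Xb), yb, lam)
        # Weight each learner by inverse out-of-bag RMSE (a simple proxy
        # for the strength/agreement weighting used in the paper).
        if len(oob):
            pred = kern(X[oob], Xb) @ alpha
            rmse = np.sqrt(np.mean((y[oob] - pred) ** 2))
            w = 1.0 / (rmse + 1e-8)
        else:
            w = 1.0
        models.append((kern, Xb, alpha, w))
    wsum = sum(m[3] for m in models)
    return [(k, Xb, a, w / wsum) for k, Xb, a, w in models]

def random_machines_predict(models, X):
    # Weighted average of the base learners' predictions.
    return sum(w * (k(X, Xb) @ a) for k, Xb, a, w in models)
```

Because each base learner draws its kernel at random, no single "best" kernel has to be selected up front; poorly matched kernels are simply down-weighted by their out-of-bag performance.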