Abstract

The selection of hyper-parameters in support vector regression algorithms (SVMr) is an essential step in the training of these learning machines. Unfortunately, there is no exact method to obtain the optimal values of the SVMr hyper-parameters, so a search algorithm, and often a validation method, must be used to find the best combination of hyper-parameters. The problem is that the SVMr training time can become very large on big training databases if standard search algorithms and validation methods (such as grid search and K-fold cross validation) are used. In this paper we propose two novel validation methods that reduce the SVMr training time while maintaining the accuracy of the final machine. We show the good performance of both methods in the standard SVMr with 3 hyper-parameters (where the hyper-parameter search is usually carried out by means of a grid search), and also in the extension to multi-parametric kernels, where meta-heuristic approaches such as evolutionary algorithms must be used to search for the best set of SVMr hyper-parameters. In all cases the new validation methods provide very good results in terms of training time without affecting the final SVMr accuracy.
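To make the baseline the paper improves upon concrete, the following is a minimal sketch of the standard approach the abstract refers to: a grid search over the three usual SVMr hyper-parameters (C, epsilon, and the RBF kernel width gamma) validated with K-fold cross validation. The data, grid values, and use of scikit-learn are illustrative assumptions, not the authors' setup; the point is that the cost grows with the grid size times the number of folds, which is what motivates faster validation methods.

```python
# Illustrative baseline: grid search + K-fold cross validation for SVR.
# This is NOT the paper's proposed method, only the standard procedure
# whose training time the paper aims to reduce.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic regression data (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=200)

# The 3 standard SVMr hyper-parameters; grid values are arbitrary examples
param_grid = {
    "C": [0.1, 1, 10],
    "epsilon": [0.01, 0.1],
    "gamma": [0.1, 1],
}

# 5-fold cross validation: each of the 3*2*2 = 12 grid points
# trains 5 SVRs, i.e. 60 trainings in total for this tiny grid
search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid,
    cv=5,
    scoring="neg_mean_squared_error",
)
search.fit(X, y)
best = search.best_params_
```

For multi-parametric kernels the grid becomes exponentially large, which is why the abstract points to meta-heuristics such as evolutionary algorithms instead of exhaustive search.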
