Abstract

The performance of the Support Vector Machine (SVM) is significantly affected by its model parameters. One commonly used parameter selection method for SVM, the grid search (GS) method, is very time-consuming. This paper introduces a Uniform Design (UD) and Support Vector Regression (SVR) approach to reduce the computational cost of the traditional GS method: the SVM error bounds are computed only at nodes selected by the UD method, and an SVR model is then trained on these computed results. Subsequently, the SVM error bounds at the remaining nodes are estimated by the SVR function, and the optimal parameters are selected based on these estimates. Experiments on seven standard datasets show that the parameters selected by the proposed method yield test error rates similar to those obtained by the conventional GS method, while the computational cost can be reduced at most from O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.

Keywords: Support Vector Machine, Support Vector Regression, Support Vector Machine Classifier, Uniform Design, Support Vector Machine Regression
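A minimal sketch of the surrogate-based selection idea described above is given below, assuming an RBF-kernel SVM with two parameters (C and gamma). The 5-fold cross-validation error is used here only as a stand-in for the paper's SVM error bound, and the nine design points are an illustrative placeholder rather than an actual uniform design table; dataset, grid ranges, and SVR hyperparameters are likewise assumptions, not values from the paper.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR

# Toy data standing in for one of the benchmark datasets.
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Full candidate grid for the two RBF-SVM parameters (log2 scale),
# i.e. the n^m nodes a conventional grid search would visit (m = 2 here).
log2_C = np.linspace(-5, 15, 21)
log2_gamma = np.linspace(-15, 3, 19)
grid = np.array([(c, g) for c in log2_C for g in log2_gamma])

# A small set of design points spread over the parameter rectangle.
# NOTE: placeholder for a proper uniform design table; here we simply
# take a low-discrepancy-like scattering of 9 points in [0, 1]^2.
ud_fractions = np.array([
    [0.1, 0.6], [0.2, 0.1], [0.3, 0.9], [0.4, 0.4], [0.5, 0.7],
    [0.6, 0.2], [0.7, 0.95], [0.8, 0.5], [0.9, 0.3],
])
lo, hi = grid.min(axis=0), grid.max(axis=0)
ud_points = lo + ud_fractions * (hi - lo)

def cv_error(log2_c, log2_g):
    """5-fold CV error, used here in place of the SVM error bound."""
    clf = SVC(kernel="rbf", C=2.0 ** log2_c, gamma=2.0 ** log2_g)
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

# Step 1: evaluate the (expensive) error only at the design points.
ud_errors = np.array([cv_error(c, g) for c, g in ud_points])

# Step 2: fit an SVR surrogate mapping parameters -> estimated error.
surrogate = SVR(kernel="rbf", C=10.0, gamma="scale", epsilon=0.001)
surrogate.fit(ud_points, ud_errors)

# Step 3: predict the error over the whole grid and pick the minimizer,
# so only len(ud_points) SVM trainings are needed instead of len(grid).
predicted = surrogate.predict(grid)
best_log2_c, best_log2_g = grid[np.argmin(predicted)]
print(f"selected C = 2^{best_log2_c:.1f}, gamma = 2^{best_log2_g:.1f}")
```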
