Abstract

Support vector regression (SVR) is a nonlinear prediction method based on kernel functions and has been widely applied to real-world problems. Although a well-tuned SVR can achieve high accuracy, its performance strongly depends on its hyperparameters. Therefore, determining these parameters is important when applying SVR to real-world problems. The optimum parameters are usually determined by an exhaustive grid search, but this approach is impractical when the sample size is very large. To reduce the computational time required to determine the optimum parameters, we employ orthogonal arrays and propose two efficient methods for SVR parameter tuning based on variable selection in the Taguchi method. The proposed methods can reduce the computational time to approximately one-twelfth of that required by a grid search. We also validated the actual computational time and accuracy of the proposed methods by applying them to five real datasets from the UCI repository.
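
To illustrate the idea behind the abstract, the sketch below contrasts an exhaustive grid search over SVR hyperparameters with a search restricted to an L9 orthogonal array, which evaluates only 9 of the 27 level combinations. This is not the authors' implementation; the dataset, candidate parameter levels, and cross-validation scoring are illustrative assumptions, and the L9 design is the standard three-level array from the Taguchi literature.

```python
# Minimal sketch (assumed setup, not the paper's code): exhaustive grid search
# vs. an orthogonal-array-style search for SVR hyperparameters.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVR

# Synthetic regression data stands in for a real UCI dataset.
X, y = make_regression(n_samples=300, n_features=10, noise=0.1, random_state=0)

# Three candidate levels per hyperparameter (illustrative values).
levels = {
    "C":       [1.0, 10.0, 100.0],
    "epsilon": [0.01, 0.1, 1.0],
    "gamma":   [0.01, 0.1, 1.0],
}

# Exhaustive grid search: 3 x 3 x 3 = 27 candidate settings.
grid = GridSearchCV(SVR(kernel="rbf"), levels, cv=3)
grid.fit(X, y)
print("grid search best:", grid.best_params_)

# L9 orthogonal array for three 3-level factors: 9 balanced combinations
# in which every pair of factor levels appears equally often.
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2),
      (1, 0, 1), (1, 1, 2), (1, 2, 0),
      (2, 0, 2), (2, 1, 0), (2, 2, 1)]

best_score, best_params = -np.inf, None
for i, j, k in L9:
    params = {"C": levels["C"][i],
              "epsilon": levels["epsilon"][j],
              "gamma": levels["gamma"][k]}
    score = cross_val_score(SVR(kernel="rbf", **params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, params
print("orthogonal-array search best:", best_params)
```

In this toy setting the orthogonal-array search fits roughly a third as many models as the full grid; the paper's reported speed-up (about one-twelfth of the grid-search time) comes from combining such designs with variable selection over more hyperparameter levels.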
