Abstract

In this paper, Support Vector Regression (SVR) models using three different kernels, a polynomial kernel, a Radial Basis Function (RBF) kernel, and a mixed kernel, are constructed to examine training performance on unarranged data obtained from 32 virtual 3-D computer models. The 32 samples used as input data for training the three SVR models are represented by sets of coordinate values of points extracted from 3-D models built in 3-D software according to the shapes of 32 actual hairdryer products. To obtain target values for training, 37 subjects evaluate all 32 samples against one adjective (streamline), and the subjects' scores are averaged to form the target values of the training models. In addition, k-fold cross-validation (C-V) is used to find the optimal parameter combination for each SVR model. The performance of the SVR with each of the three kernels in estimating the product image values is measured by the Root Mean Square Error (RMSE). The results show that the optimal SVR model using the polynomial kernel performs better than the one using the RBF kernel, while the mixed kernel gives the best performance of the three. The study also shows that the RBF kernel alone has a local characteristic and cannot handle broadly distributed data well; it can, however, be combined with the polynomial kernel to improve the estimation power of the SVR.

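A minimal sketch of the workflow described above, assuming scikit-learn. The placeholder data, the score range, the parameter grids, and the weighted-sum form of the mixed kernel are illustrative assumptions, not the paper's actual implementation.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.metrics import mean_squared_error
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

# Placeholder data: 32 samples, each a flattened set of 3-D point coordinates,
# with one averaged adjective ("streamline") score per sample as the target.
# The feature dimension and score scale are assumptions for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 60))
y = rng.uniform(1.0, 7.0, size=32)

def mixed_kernel(A, B, gamma=0.1, degree=2, weight=0.5):
    # Weighted sum of an RBF kernel (local) and a polynomial kernel (global).
    # The kernel parameters are fixed here because scikit-learn does not pass
    # gamma or degree through to a user-supplied kernel callable.
    return (weight * rbf_kernel(A, B, gamma=gamma)
            + (1.0 - weight) * polynomial_kernel(A, B, degree=degree, gamma=gamma))

# k-fold cross-validation over a small parameter grid, mirroring the C-V
# search for the optimal parameter combination described in the abstract.
cv = KFold(n_splits=4, shuffle=True, random_state=0)
models = {
    "poly": (SVR(kernel="poly"),
             {"C": [1, 10, 100], "epsilon": [0.01, 0.1], "degree": [2, 3]}),
    "rbf": (SVR(kernel="rbf"),
            {"C": [1, 10, 100], "epsilon": [0.01, 0.1], "gamma": [0.01, 0.1, 1.0]}),
    "mixed": (SVR(kernel=mixed_kernel),
              {"C": [1, 10, 100], "epsilon": [0.01, 0.1]}),
}

for name, (svr, grid) in models.items():
    search = GridSearchCV(svr, grid, cv=cv, scoring="neg_root_mean_squared_error")
    search.fit(X, y)
    rmse = np.sqrt(mean_squared_error(y, search.best_estimator_.predict(X)))
    print(f"{name}: best parameters {search.best_params_}, training RMSE {rmse:.3f}")

In this sketch the two kernels are blended by a fixed weighted sum; the paper's mixed kernel may combine the RBF and polynomial terms differently.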