Abstract

Support vector machines (SVMs) are a powerful classification technique that is becoming increasingly popular in a wide range of applications, largely because of their robustness to violations of model assumptions and to outliers. Kernel-based SVMs are particularly useful for capturing non-linear patterns in the data. However, kernel methods can become computationally demanding, because the kernel substantially increases the time required to train the model; this increase stems mainly from the appearance of the kernel matrix in the quadratic optimization problem (QOP). To address this computational burden, we propose a novel method based on low-rank approximation, obtained by applying a truncated Mercer series expansion to the kernel. The quadratic optimization problem in the kernel-based SVM is then replaced with a much simpler optimization problem. In the proposed approach, the vector computations and matrix decompositions are considerably faster, and these changes lead to an efficient solution of the QOP and, ultimately, to more efficient classification. Finally, we present numerical illustrations based on ROC curves and other classification performance benchmarks to assess the proposed low-rank kernel approximation within the SVM framework. The results show a considerable improvement in classification efficiency, together with a significant reduction in the computational time required to train and forecast the stock market index (S&P 500) and to recognize promoters in DNA sequences.
