Abstract

The Support Vector Machine (SVM) classifier is a machine learning algorithm for predicting the class of data. Traditional SVM classifiers have limitations when training on large-scale data: the process tends to be slow. This study aims to improve the efficiency of the SVM classifier using a fractional gradient descent optimization algorithm, so that training speed can be increased on large-scale data. Ten numerical data sets are used in the simulation to test the performance of the SVM classifier optimized with the Caputo-type fractional gradient descent algorithm. In this paper, we use the Caputo derivative formula to compute the fractional-order gradient of the error function with respect to the weights and obtain deterministic convergence, which increases the convergence speed of the Caputo-type fractional-order method. The test results show that the optimized SVM classifier converges in fewer iterations and with a smaller error value. In future work, the optimized linear SVM classifier with fractional gradient descent will be applied to the problem of imbalanced class data.
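The abstract does not give the update rule, but a common first-order Caputo-type fractional gradient descent step scales the ordinary gradient by |w - c|^(1-α) / Γ(2-α), where α is the fractional order and c is the lower terminal of the Caputo derivative. The sketch below applies that rule to a linear SVM with hinge loss; the function name, hyperparameters, and the specific approximation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from math import gamma


def caputo_fgd_svm(X, y, alpha=0.9, lr=0.01, lam=0.01,
                   epochs=200, c=0.0, eps=1e-8):
    """Train a linear SVM (hinge loss + L2 penalty) with a
    Caputo-type fractional gradient descent update.

    Illustrative sketch only: uses the common first-order
    approximation of the Caputo fractional gradient, in which the
    integer-order gradient is rescaled elementwise by
    |w - c|**(1 - alpha) / Gamma(2 - alpha).
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    coef = 1.0 / gamma(2.0 - alpha)
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1  # samples violating the margin
        grad_w = lam * w - (X[mask].T @ y[mask]) / n
        grad_b = -y[mask].sum() / n
        # Caputo-type fractional step: rescale the gradient by
        # |w - c|^(1-alpha) / Gamma(2-alpha); eps avoids 0**(1-alpha)
        # stalling the update at the origin.
        w -= lr * grad_w * coef * (np.abs(w - c) + eps) ** (1.0 - alpha)
        b -= lr * grad_b
    return w, b
```

With `alpha = 1.0` the scaling factor reduces to 1 and the update becomes ordinary (integer-order) gradient descent, which makes the fractional order easy to compare against the classical baseline.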
