Abstract

Support vector machines (SVM), a non-parametric method, offer a principled approach to machine learning (ML) problems because of their mathematical foundations in statistical learning theory, but they require all the data to be available during the training phase. However, once the model parameters are identified, SVM relies for future prediction only on a subset of these training instances, called support vectors (SV). The SVM model is mathematically written as a weighted sum over these SV, whose number, rather than the dimensionality of the input space, defines the SVM's complexity. Since the final number of SV can be up to half the size of the training dataset, SVM becomes challenging to run on energy-aware computing platforms. We propose in this work Knee-Cut SVM (KCSVM) and Knee-Cut Ordinal Optimization inspired SVM (KCOOSVM), which use a soft trick of ordered kernel values and uniform subsampling to reduce SVM's prediction computational complexity while maintaining an acceptable impact on its generalization capability. When tested on several databases from UCL, KCSVM and KCOOSVM produced promising results, comparable to similar published algorithms.
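The idea of pruning the weighted sum over support vectors by ordering kernel values can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the `keep` retention fraction, and the function names are assumptions, and the paper's actual knee-detection criterion on the ordered kernel values is replaced here by a simple fixed cutoff.

```python
import numpy as np

def svm_decision(x, sv, alpha_y, b, gamma=1.0):
    """Full SVM decision value: weighted sum of RBF kernel
    evaluations over all support vectors (alpha_y = alpha_i * y_i)."""
    k = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))
    return np.dot(alpha_y, k) + b

def kneecut_decision(x, sv, alpha_y, b, keep=0.5, gamma=1.0):
    """Approximate decision using only the support vectors with the
    largest kernel responses to x. The 'keep' fraction is a hypothetical
    stand-in for a knee cut on the sorted kernel values."""
    k = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))
    m = max(1, int(keep * len(k)))
    idx = np.argsort(k)[-m:]  # indices of the m largest kernel values
    return np.dot(alpha_y[idx], k[idx]) + b
```

With `keep=1.0` the approximation reduces to the full decision function; smaller values trade prediction accuracy for fewer kernel evaluations per test point.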
