Abstract

Twin-KSVC is a multi-class classification algorithm that finds two nonparallel hyper-planes for the two focused classes by solving a pair of smaller-sized quadratic programming problems (QPPs), which makes its learning speed faster than that of many other multi-class classification algorithms. However, it ignores the local information of samples, so every sample receives the same weight when the separating hyper-planes are constructed, even though different samples in fact influence the hyper-planes to different degrees. Motivated by these observations, we propose a K-nearest neighbor (KNN)-based weighted multi-class twin support vector machine (KWMTSVM). A weight matrix W is employed in the objective function to exploit the local intra-class information, while weight vectors f and h are introduced into the constraints to exploit the inter-class information. When a component f_j = 0 or h_k = 0, the j-th or k-th constraint is redundant; removing such redundant constraints effectively improves the computational speed of the classifier. Experimental results on eleven benchmark datasets and the ABCD dataset demonstrate the validity of the proposed algorithm.
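To make the weighting idea concrete, the following Python sketch shows one plausible way to build a binary intra-class weight matrix W and an inter-class weight vector f from k-nearest-neighbor relations. It is a minimal illustration only: the helper name knn_weights, the 0/1 weighting scheme, and the parameter k are assumptions for exposition, not the paper's exact definitions.

```python
import numpy as np

def knn_weights(X_intra, X_inter, k=5):
    """Illustrative KNN-based weighting (hypothetical helper, not the paper's formulation).

    W[i, j] = 1 if x_j is among the k nearest intra-class neighbors of x_i, else 0.
    f[j]    = 1 if inter-class sample x_j is a k-nearest (inter-class) neighbor of at
              least one intra-class sample, else 0; f[j] = 0 marks the j-th constraint
              as redundant, so it can be dropped before solving the QPP.
    """
    # Pairwise squared Euclidean distances within the focused class.
    d_intra = np.sum((X_intra[:, None, :] - X_intra[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d_intra, np.inf)              # a sample is not its own neighbor
    nn_intra = np.argsort(d_intra, axis=1)[:, :k]  # indices of the k nearest neighbors

    n = X_intra.shape[0]
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), k)
    W[rows, nn_intra.ravel()] = 1.0
    W = np.maximum(W, W.T)                         # symmetrize the neighborhood graph

    # Distances from each intra-class sample to every inter-class sample.
    d_inter = np.sum((X_intra[:, None, :] - X_inter[None, :, :]) ** 2, axis=-1)
    nn_inter = np.argsort(d_inter, axis=1)[:, :k]

    f = np.zeros(X_inter.shape[0])
    f[np.unique(nn_inter.ravel())] = 1.0           # f_j = 0  ->  j-th constraint redundant
    return W, f
```

In use, only the inter-class samples with f_j = 1 (and analogously h_k = 1 for the other class) would be kept as constraints, e.g. X_inter_kept = X_inter[f > 0], which shrinks the QPP and is the source of the speed-up described above.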
