Abstract

When all training instances and labels are considered together in a single optimization problem, multi-label support and core vector machines (i.e., Rank-SVM and Rank-CVM) are formulated as quadratic programming (QP) problems with equality and bound constraints, whose training procedures have a sub-linear convergence rate. It is therefore highly desirable to design and implement a novel, efficient SVM-type multi-label algorithm. In this paper, by applying pairwise constraints between relevant and irrelevant labels and defining an approximate ranking loss, we generalize the binary Lagrangian support vector machine (LSVM) to a multi-label form (Rank-LSVM), resulting in a strictly convex QP problem with non-negativity constraints only. In particular, each training instance is associated with a block of variables, so all variables are naturally divided into manageable blocks. Consequently, we build an efficient training procedure for Rank-LSVM using the random block coordinate descent method, which has a linear convergence rate. Moreover, a heuristic strategy is applied to reduce the number of support vectors. Experimental results on twelve data sets demonstrate that our method performs better on five performance measures, runs on average 15 and 107 times faster, and has 9% and 15% fewer support vectors than Rank-CVM and Rank-SVM, respectively.
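To make the optimization strategy concrete, below is a minimal sketch of random block coordinate descent applied to a strictly convex QP with non-negativity constraints of the generic form min_a 0.5 a^T Q a - c^T a subject to a >= 0. This illustrates the general technique only, not the paper's actual Rank-LSVM solver; the function name, the fixed block layout, and the epoch-based stopping rule are assumptions for illustration.

import numpy as np

def random_block_coordinate_descent(Q, c, block_size, n_epochs=50, seed=0):
    # Minimize 0.5 * a^T Q a - c^T a subject to a >= 0,
    # sweeping blocks of coordinates in random order.
    # Q must be symmetric positive definite (strictly convex QP).
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    a = np.zeros(n)
    grad = -c.copy()                        # gradient Q a - c at a = 0
    blocks = [np.arange(s, min(s + block_size, n))
              for s in range(0, n, block_size)]
    for _ in range(n_epochs):
        for b in rng.permutation(len(blocks)):
            for i in blocks[b]:
                # Exact 1-D minimization over coordinate i,
                # projected onto the non-negative orthant.
                new_ai = max(0.0, a[i] - grad[i] / Q[i, i])
                delta = new_ai - a[i]
                if delta != 0.0:
                    grad += delta * Q[:, i]  # keep the gradient current
                    a[i] = new_ai
    return a

# Hypothetical usage on a small random positive-definite problem:
rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
Q = M @ M.T + 1e-2 * np.eye(6)
c = rng.standard_normal(6)
a = random_block_coordinate_descent(Q, c, block_size=2)

Because Q is positive definite, each exact coordinate update cannot increase the objective, and randomized sweeps over such blocks are known to converge linearly on strongly convex problems, which is the property the abstract attributes to the Rank-LSVM training procedure.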
