Abstract

Multi-label support vector machine with a zero label (Rank-SVMz) is an effective SVM-type technique for multi-label classification. It is formulated as a quadratic programming (QP) problem with several disjoint equality constraints and many box constraints, and is solved by the Frank–Wolfe method (FWM) combined with a one-versus-rest (OVR) decomposition. However, it is still highly desirable to speed up the training and testing procedures of Rank-SVMz for many real-world applications. Because of the special disjoint equality constraints, the variables to be solved in Rank-SVMz are naturally partitioned into several blocks by the OVR technique. We therefore propose a random block coordinate descent method (RBCDM) for Rank-SVMz in this paper. At each iteration, the entire QP problem is divided into a series of small-scale QP sub-problems, and each sub-problem, with a single equality constraint and many box constraints, is solved by the sequential minimal optimization (SMO) method used in binary SVM. Theoretical analysis shows that RBCDM has a much lower time complexity than FWM for Rank-SVMz. Our experimental results on six benchmark data sets demonstrate that, on average, RBCDM runs 11 times faster, produces 12% fewer support vectors, and achieves better classification performance than FWM for Rank-SVMz. Therefore, Rank-SVMz with RBCDM is a powerful candidate for multi-label classification.
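
To make the iteration structure concrete, the following is a minimal sketch of a random block coordinate descent loop over a generic block-constrained QP (minimize 0.5*a^T Q a - p^T a subject to one equality constraint per block and box constraints 0 <= a_i <= C), with each block sub-problem handled by SMO-style pairwise updates. The function names, the per-block constants `c`, and the exact QP form are illustrative assumptions for exposition; they are not the paper's exact Rank-SVMz dual formulation.

```python
import numpy as np

def smo_subproblem(Q, p, a, idx, box_c, n_inner=200, tol=1e-8):
    """SMO-style pairwise updates for one block's QP sub-problem:
    minimize 0.5 * a^T Q a - p^T a over the variables in `idx`,
    keeping sum(a[idx]) fixed (single equality constraint) and
    respecting the box constraints 0 <= a_i <= box_c."""
    idx = np.asarray(idx)
    for _ in range(n_inner):
        g = Q[idx] @ a - p[idx]                  # block gradient (coupling to other blocks via Q)
        can_up = a[idx] < box_c - 1e-12          # variables with room to increase
        can_down = a[idx] > 1e-12                # variables with room to decrease
        if not can_up.any() or not can_down.any():
            break
        i_loc = np.where(can_up, g, np.inf).argmin()
        j_loc = np.where(can_down, g, -np.inf).argmax()
        if g[i_loc] - g[j_loc] >= -tol:          # approximate KKT: no descent pair left
            break
        i, j = idx[i_loc], idx[j_loc]
        eta = Q[i, i] + Q[j, j] - 2.0 * Q[i, j]  # curvature along the direction e_i - e_j
        t = (g[j_loc] - g[i_loc]) / eta if eta > 1e-12 else box_c
        t = min(t, box_c - a[i], a[j])           # clip the step to stay inside the box
        a[i] += t
        a[j] -= t
    return a

def rbcdm(Q, p, blocks, c, box_c, n_epochs=20, seed=0):
    """Random block coordinate descent (illustrative sketch): visit the
    per-label blocks in random order and solve each block's sub-problem
    with SMO, holding the variables of all other blocks fixed."""
    rng = np.random.default_rng(seed)
    a = np.zeros(len(p))
    for k, b in enumerate(blocks):               # feasible start: spread c[k] evenly over block k
        a[np.asarray(b)] = c[k] / len(b)         # assumes c[k] <= box_c * len(b)
    for _ in range(n_epochs):
        for k in rng.permutation(len(blocks)):   # random block order in every epoch
            smo_subproblem(Q, p, a, blocks[k], box_c)
    return a
```

Because each pairwise SMO update moves two variables of the same block by opposite amounts, every block's equality constraint remains satisfied throughout, which is what lets the method work on one small sub-problem at a time.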
