Abstract

Multi-label learning has drawn wide attention over the last decade. To exploit correlations between labels, a nuclear-norm-based multi-label classifier was recently proposed that joins Ranking Support Vector Machine (RankSVM) and Binary Relevance (BR) with Robust Low-rank learning (RBRL), and it achieves satisfactory classification performance in applications. Nonetheless, scaling RBRL to large problems remains a challenge. Motivated by this, a Subspace Screening Rule (SSR) for RBRL is proposed to accelerate the solving process. Its primary strategy is to reduce the size of the matrix variable to be estimated, based on the fact that a low-rank matrix can be represented by a few subspaces. Specifically, at each iteration the majority of subspaces whose coefficients are zero in the optimal solution are deleted by means of matrix decomposition and the optimality conditions; the small-scale reduced problem is then solved in place of the original large-scale matrix problem, yielding a substantial acceleration. To further improve the solving speed, Approximate Singular Value Decomposition (ASVD) and Accelerated Proximal Gradient (APG) are employed at different stages. Extensive experiments on seven benchmark datasets as well as an artificial dataset demonstrate the efficiency of SSR.
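The screening idea described above can be illustrated with a minimal sketch. Assuming a simplified least-squares surrogate with a nuclear-norm penalty (the problem sizes, variable names, and the plain proximal-gradient loop below are illustrative assumptions, not the paper's actual RBRL formulation or algorithm), singular value thresholding reveals which subspaces carry zero coefficients, and those can be dropped so that later iterations work with a much smaller reduced matrix:

```python
# Hedged sketch of subspace screening for a nuclear-norm-regularized
# problem:  min_W  0.5 * ||X @ W - Y||_F^2 + lam * ||W||_*
# (a stand-in for the RBRL objective; all names here are illustrative).
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-label data with a low-rank ground-truth weight matrix.
n, d, q = 200, 50, 10                       # samples, features, labels
X = rng.standard_normal((n, d))
W_true = rng.standard_normal((d, 3)) @ rng.standard_normal((3, q))  # rank 3
Y = X @ W_true + 0.01 * rng.standard_normal((n, q))
lam = 5.0

def prox_nuclear(M, t):
    """Singular value thresholding: the proximal operator of t * ||.||_*."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U, np.maximum(s - t, 0.0), Vt

# Proximal-gradient iterations expose which subspaces survive: singular
# values shrunk to zero correspond to subspaces with zero coefficients.
step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant
W = np.zeros((d, q))
for _ in range(50):
    G = X.T @ (X @ W - Y)                   # gradient of the smooth part
    U, s, Vt = prox_nuclear(W - step * G, step * lam)
    W = (U * s) @ Vt

# "Screening": delete subspaces whose coefficients are zero, keeping only
# the reduced factors; subsequent solves use this smaller representation.
keep = s > 0
U_r, s_r, Vt_r = U[:, keep], s[keep], Vt[keep, :]
print("kept subspaces:", int(keep.sum()), "of", len(s))
```

In this toy setting most singular values are thresholded to zero, so the reduced factors `U_r`, `Vt_r` span far fewer subspaces than the full matrix, which is the source of the speed-up the abstract describes.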

