Abstract

Multiview learning (MVL) addresses problems in which each instance is represented by multiple distinct feature sets. Efficiently exploring and exploiting the common and complementary information among different views remains challenging in MVL. However, many existing algorithms handle multiview problems via pairwise strategies, which limit the exploration of relationships among views and dramatically increase the computational cost. In this article, we propose a multiview structural large margin classifier (MvSLMC) that simultaneously satisfies the consensus and complementarity principles across all views. Specifically, on the one hand, MvSLMC employs a structural regularization term to promote within-class cohesion and between-class separability in each view. On the other hand, each view provides additional structural information to the others, which favors the diversity of the classifier. Moreover, the hinge loss in MvSLMC induces sample sparsity, which we leverage to construct a safe screening rule (SSR) that accelerates MvSLMC. To the best of our knowledge, this is the first attempt at safe screening in MVL. Numerical experiments demonstrate the effectiveness of MvSLMC and its safe acceleration method.
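The link between hinge loss and safe screening mentioned above can be illustrated with a minimal sketch. This toy example (not the paper's MvSLMC; the classifier and data here are hypothetical) shows that samples whose margin already satisfies y·f(x) ≥ 1 incur zero hinge loss, so a safe screening rule can discard them without changing the solution:

```python
import numpy as np

# Toy illustration: with hinge loss L(x, y) = max(0, 1 - y * f(x)),
# samples whose margin y * f(x) >= 1 contribute zero loss and a zero
# subgradient, so they can be safely screened out of training.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
w = np.array([1.0, -1.0])                  # hypothetical fixed linear classifier
y = np.sign(X @ w + rng.normal(scale=0.3, size=200))  # noisy labels

margins = y * (X @ w)                      # y_i * f(x_i)
hinge = np.maximum(0.0, 1.0 - margins)     # per-sample hinge loss

screened = margins >= 1.0                  # zero-loss samples: safely removable
print(f"screened out {screened.sum()} of {len(X)} samples")
print(f"total hinge loss of screened samples: {hinge[screened].sum():.1f}")
```

In an actual SSR, the margins are bounded using information from a previously solved problem (e.g. along a regularization path), so screening is performed before, not after, solving, which is where the acceleration comes from.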
