Abstract

Multiple kernel learning (MKL) methods are generally believed to handle the nonlinear subspace clustering problem better than single kernel learning (SKL) methods, largely because MKL avoids the need to select and tune a single pre-defined kernel. However, previous MKL methods have mainly focused on how to define a kernel weighting strategy, while ignoring the structural characteristics of the input data in both the original space and the kernel space. In this paper, we propose a novel graph-based MKL method for subspace clustering, termed Local Structural Graph and Low-Rank Consensus Multiple Kernel Learning (LLMKL). It jointly learns an optimal affinity graph and a suitable consensus kernel for clustering by integrating MKL, the global structure in the kernel space, the local structure in the original space, and the Hilbert-space self-expressiveness property into a unified optimization model. In particular, to capture the global structure of the data, we employ a substitute for the desired consensus kernel and impose a low-rank constraint on this substitute, encouraging a linear subspace structure to emerge in the feature space. Moreover, the local structure of the data is explored by building a complete graph, in which each sample is treated as a node and each edge encodes the pairwise affinity between two samples. In this way, consensus kernel learning and affinity graph learning promote each other, so that the data in the resulting Hilbert space are both self-expressive and low-rank. Experiments on both image and text clustering demonstrate that LLMKL outperforms state-of-the-art methods.
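
To illustrate how such terms can be combined, a generic objective of this kind is sketched below. This is not the authors' exact formulation (which is given in the full paper); all symbols and trade-off parameters \lambda_1, \lambda_2, \lambda_3 are illustrative. Here K_m are pre-computed base kernels, \mathbf{w} their mixing weights, K the consensus kernel, Z the self-expression coefficient matrix, and A the affinity graph:

\min_{Z,\,A,\,\mathbf{w}} \;\; \underbrace{\operatorname{Tr}(K) - 2\operatorname{Tr}(KZ) + \operatorname{Tr}(Z^{\top} K Z)}_{\text{self-expressiveness in Hilbert space}} \;+\; \lambda_1 \underbrace{\|K\|_{*}}_{\text{low-rank consensus}} \;+\; \lambda_2 \underbrace{\sum_{i,j} \|x_i - x_j\|_2^2\, A_{ij}}_{\text{local structure in original space}} \;+\; \lambda_3 \|Z - A\|_F^2

\text{s.t.}\quad K = \sum_{m=1}^{M} w_m K_m,\quad w_m \ge 0,\quad \sum_{m=1}^{M} w_m = 1,\quad A \ge 0,\quad A\mathbf{1} = \mathbf{1}.

In a formulation of this type, the first term enforces self-expressiveness of the mapped data, the nuclear norm on K encourages the low-rank feature-space structure, the graph term ties the affinity A to distances in the original space, and the coupling term lets the kernel and the graph reinforce each other; the final clustering is typically obtained by spectral clustering on the learned affinity.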
