Abstract

Most subspace clustering methods construct a similarity matrix from the self-expressive property of data and then apply spectral relaxation to that matrix to obtain the final clusters. Despite the advantages of this framework, it has two easily overlooked limitations. First, the original self-expressive model captures only the global structure of the data, while the ubiquitous local structure among data points receives insufficient attention. Second, spectral relaxation is naturally suited to 2-way clustering; for multi-way clustering, the assignment of cluster members becomes indirect and requires additional steps. To overcome these problems, this paper proposes a global and local structure preserving nonnegative subspace clustering method that learns data similarities and cluster indicators in a mutually enhanced way within a unified framework. The model is further extended to kernel space to strengthen its ability to handle nonlinear data structures. To optimize the objective function, multiplicative updating rules based on nonnegative Lagrangian relaxation are developed, and their convergence is guaranteed in theory. Extensive experiments show that the proposed model outperforms many advanced clustering methods in most cases.
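For context, the two-step framework that the abstract critiques can be sketched as follows. This is a minimal illustration of the conventional pipeline (self-expressive coding followed by spectral clustering), not the paper's proposed unified method; the ridge regularizer, the helper name `self_expressive_affinity`, and the toy data are assumptions made for the sketch.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def self_expressive_affinity(X, reg=1e-2):
    """Self-expressive coding: approximate each column of X by a
    combination of the other columns (X ~ X C), here with a simple
    ridge penalty standing in for the sparsity/low-rank regularizers
    used in the subspace clustering literature."""
    n = X.shape[1]
    G = X.T @ X
    C = np.linalg.solve(G + reg * np.eye(n), G)   # closed-form ridge solution
    np.fill_diagonal(C, 0.0)                      # forbid trivial self-representation
    W = 0.5 * (np.abs(C) + np.abs(C.T))           # symmetric affinity matrix
    return W

# Toy data: columns are points drawn from two 1-D subspaces in R^3.
rng = np.random.default_rng(0)
X = np.hstack([np.outer([1, 0, 0], rng.standard_normal(20)),
               np.outer([0, 1, 1], rng.standard_normal(20))])

W = self_expressive_affinity(X)
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(W)
print(labels)
```

Note that the cluster assignment here happens in a separate post-processing step on the spectral embedding, which is precisely the indirectness for multi-way clustering that the proposed method aims to avoid by learning similarities and cluster indicators jointly.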
