Abstract

Multi-view subspace clustering, which exploits the similarities and differences among data observed in multiple views, is an effective clustering technique. However, to handle non-linear structures and non-Gaussian noise in multi-view data, existing methods typically resort to convex relaxations of the original problem, and the solutions produced by these relaxations are generally not optimal for the original problem. To overcome this deficiency, this paper presents a robust low-rank kernel multi-view subspace clustering approach that combines the non-convex Schatten p-norm (0<p≤1) regularizer with the "kernel trick", allowing non-linear structures in multi-view data to be handled efficiently without convex relaxation. In addition, correntropy, a measure that is robust to corruptions caused by non-Gaussian noise, is introduced into the model. Moreover, the method learns a joint subspace representation of all views; because it learns a low-rank kernel mapping, the data in the feature space are both low-rank and self-expressive. The resulting optimization problem is solved efficiently via an iterative algorithm (HQ-ADMM), in which every subproblem at each iteration admits a closed-form solution, substantially simplifying the optimization. Experimental comparisons on five real-world datasets demonstrate that the proposed algorithm outperforms state-of-the-art multi-view subspace clustering algorithms.
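For reference, the two key quantities named above have standard textbook definitions; the following forms are not taken from the paper itself, whose exact weighting and kernel choices may differ. The Schatten p-norm of a matrix X is
\[
\|X\|_{S_p} = \Big(\sum_{i} \sigma_i(X)^{p}\Big)^{1/p}, \qquad 0 < p \le 1,
\]
where \(\sigma_i(X)\) denotes the i-th singular value of X; the quasi-norm \(\|X\|_{S_p}^{p} = \sum_i \sigma_i(X)^p\) acts as a non-convex surrogate of \(\mathrm{rank}(X)\) and reduces to the nuclear norm at p = 1. The correntropy between two variables a and b, with a Gaussian kernel of bandwidth \(\sigma\), is
\[
V_\sigma(a, b) = \mathbb{E}\big[\exp\!\big(-(a - b)^2 / (2\sigma^2)\big)\big],
\]
which down-weights large residuals and is therefore robust to non-Gaussian noise.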
