Abstract

Large-scale subspace clustering usually avoids constructing the full similarity matrix and Laplacian matrix; instead, it builds an anchor affinity matrix and uses matrix approximation methods to reduce the clustering complexity. However, existing methods for computing the anchor affinity matrix consider only the global structure of the data. Moreover, directly using the anchor affinity matrix to approximate the full Laplacian matrix cannot guarantee the best low-rank approximation, which degrades the clustering results. To address these problems, this paper proposes a large-scale non-negative subspace clustering method based on Nyström approximation. First, we modify the objective function for the anchor affinity matrix by adding a local structure term, so that both local and global data characteristics are taken into account for better affinity learning. Second, an iterative matrix update rule based on non-negative Lagrangian relaxation is derived to optimize the objective function, and its correctness and convergence are proved. Finally, two effective Laplacian matrix decomposition methods based on Nyström approximation are designed to obtain more accurate eigenvectors and improve the clustering quality. The proposed algorithms are tested on various benchmark datasets. The experimental results show that our methods achieve competitive performance compared with state-of-the-art large-scale clustering algorithms.
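
To make the anchor-based setting concrete, the sketch below shows a generic anchor-graph spectral embedding with a Nyström-style low-rank factorization: an n x m anchor affinity matrix Z stands in for the full n x n similarity matrix, and the leading Laplacian eigenvectors are recovered from a thin SVD of a scaled Z. This is only an illustrative baseline under common assumptions (k-means anchor selection, Gaussian kernel), not the paper's proposed objective, update rule, or decomposition schemes.

```python
# Minimal sketch of generic anchor-based spectral embedding via a
# Nystrom-style low-rank factorization (NOT the paper's exact algorithm).
# Anchor selection with k-means and a Gaussian kernel are assumptions here.
import numpy as np
from sklearn.cluster import KMeans

def anchor_spectral_embedding(X, n_anchors=100, n_clusters=10, sigma=1.0, rng=0):
    # 1) Pick anchors as k-means centroids (one common choice).
    anchors = KMeans(n_clusters=n_anchors, random_state=rng,
                     n_init=10).fit(X).cluster_centers_

    # 2) Build the n x m anchor affinity matrix Z with a Gaussian kernel,
    #    row-normalized so each row sums to one.
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    Z = np.exp(-d2 / (2.0 * sigma ** 2))
    Z /= Z.sum(axis=1, keepdims=True)

    # 3) The full similarity S = Z diag(Z^T 1)^{-1} Z^T is never formed;
    #    its leading eigenvectors are the left singular vectors of
    #    B = Z diag(Z^T 1)^{-1/2}, obtained from a thin SVD.
    col_sums = Z.sum(axis=0)
    B = Z / np.sqrt(col_sums)              # n x m factor
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    return U[:, :n_clusters]               # spectral embedding for the final k-means step

# Usage: embed the data, then cluster the rows of the embedding.
X = np.random.default_rng(0).normal(size=(2000, 16))
emb = anchor_spectral_embedding(X, n_anchors=64, n_clusters=5)
labels = KMeans(n_clusters=5, random_state=0, n_init=10).fit_predict(emb)
```

Because the SVD is taken on an n x m matrix rather than the n x n similarity matrix, the cost scales linearly in n for fixed m, which is the complexity advantage the abstract refers to; the paper's contribution is in how Z is learned and how the Laplacian decomposition is carried out, which this sketch does not reproduce.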
