Subspace clustering has gained popularity in unsupervised machine learning owing to its strong dimensionality reduction capability and interpretability. Although existing research has made significant progress in clustering performance, it still faces two limitations. The first is poor feature representation, which makes it difficult to exploit global structural sparsity. The second is long runtime, which makes existing methods computationally inefficient. In this paper, we propose a joint sparse subspace clustering (JSSC) method that captures the structural features of the representation matrix via an ℓ2,0-norm constraint, which encourages row-wise sparsity and thereby enhances clustering performance. Algorithmically, we develop an optimization scheme based on proximal alternating minimization (PAM) in which each subproblem admits a closed-form solution, keeping every iteration efficient. Moreover, the convergence of the scheme is rigorously proved. Numerical experiments on image and hyperspectral datasets validate its excellent performance, delivering high accuracy at high computational speed. In particular, on the ORL dataset the proposed method improves clustering accuracy by at least 6.1% and reduces runtime by a factor of at least 14 compared with competing methods. The code of our proposed JSSC is available at https://github.com/zhudafa/JSSC.
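
The abstract describes the ℓ2,0-norm constraint as encouraging row-wise sparsity of the representation matrix, with closed-form subproblem solutions inside PAM. As a minimal sketch, assuming the standard proximal operator of λ‖·‖_{2,0} (row-wise hard thresholding) rather than the authors' released implementation, such a step could look like:

```python
import numpy as np

def prox_l20_rows(C, lam):
    """Proximal operator of lam * ||C||_{2,0} (row-wise hard thresholding).

    A row is kept unchanged if its squared l2 norm exceeds 2*lam and is set
    to zero otherwise. Generic illustration only, not the JSSC release code.
    """
    row_norms_sq = np.sum(C ** 2, axis=1)
    keep = row_norms_sq > 2.0 * lam      # rows that survive thresholding
    C_sparse = np.zeros_like(C)
    C_sparse[keep] = C[keep]
    return C_sparse

# Toy usage: a 5x4 representation matrix with two strong rows and three weak ones.
rng = np.random.default_rng(0)
C = np.vstack([rng.normal(size=(2, 4)), 0.01 * rng.normal(size=(3, 4))])
print(prox_l20_rows(C, lam=0.05))  # weak rows are zeroed out
```

Because each row is thresholded independently, this step has a closed-form solution and costs only O(nd) per call, which is consistent with the abstract's claim that every PAM subproblem is solved in closed form.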