Abstract

Sparse subspace clustering (SSC), a seminal clustering method, has demonstrated remarkable performance by exploiting sparse self-representations of the data. However, it has notable limitations. Chief among them, the original SSC does not support incremental learning: adding new samples requires a computationally demanding recalculation from scratch, which constrains its scalability to large datasets. Moreover, the conventional SSC framework treats dictionary construction, affinity matrix learning, and clustering as separate stages, potentially yielding dictionaries and affinity matrices that are suboptimal for clustering. To address these challenges, we present a novel clustering approach, called SSCNet, that leverages differentiable programming. Specifically, we redefine and generalize the optimization procedure of the linearized alternating direction method of multipliers (ADMM), framing it as a multi-block deep neural network in which each block corresponds to one linearized ADMM iteration; this reformulation is used to solve the SSC problem. We then use a shallow spectral embedding network as an explicit, differentiable module that approximates the eigenvalue decomposition. Finally, we incorporate a self-supervised structure to circumvent the non-differentiability of k-means and obtain the final clustering results. In essence, we assign distinct objectives to the different modules and jointly optimize all module parameters using stochastic gradient descent. Owing to the efficiency of this optimization, SSCNet scales easily to large datasets. Experimental evaluations on several benchmarks confirm that our method outperforms traditional state-of-the-art approaches.
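
To make the unrolled construction concrete, below is a minimal PyTorch sketch of how one linearized-ADMM iteration for the SSC program might be cast as a trainable network block. The class names, update rule, and learnable parameterization here are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

def soft_threshold(x, theta):
    # Proximal operator of the l1 norm; the nonlinearity of each block.
    return torch.sign(x) * torch.clamp(x.abs() - theta, min=0.0)

class LinearizedADMMBlock(nn.Module):
    # One unrolled linearized-ADMM step for the SSC program
    #   min ||C||_1  s.t.  X = XC, diag(C) = 0.
    # Learnable step size and threshold replace hand-tuned ADMM
    # hyperparameters (an assumed parameterization).
    def __init__(self):
        super().__init__()
        self.step = nn.Parameter(torch.tensor(0.1))
        self.theta = nn.Parameter(torch.tensor(0.01))

    def forward(self, C, X):
        # Gradient step on the fidelity term ||X - XC||_F^2 (the
        # linearization), followed by the l1 proximal step.
        grad = X.t() @ (X @ C - X)
        C = soft_threshold(C - self.step * grad, self.theta)
        # Zero the diagonal to enforce the SSC constraint diag(C) = 0.
        return C - torch.diag(torch.diagonal(C))

class UnrolledSSC(nn.Module):
    # Stack of K blocks, each mirroring one linearized-ADMM iteration.
    def __init__(self, n_blocks=5):
        super().__init__()
        self.blocks = nn.ModuleList(
            LinearizedADMMBlock() for _ in range(n_blocks))

    def forward(self, X):
        n = X.shape[1]                       # columns of X are samples
        C = torch.zeros(n, n, device=X.device)
        for block in self.blocks:
            C = block(C, X)
        return C                             # affinity from |C| + |C|.T

# Example: 64-dimensional features for 200 samples.
X = torch.randn(64, 200)
C = UnrolledSSC(n_blocks=5)(X)
```

Stacking a fixed number of such blocks and training the step sizes and thresholds end-to-end, jointly with the downstream spectral embedding and clustering modules, is the essential idea behind treating the solver itself as a network.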
