Nonlinear subspace clustering based on a feed-forward neural network has been demonstrated to provide better clustering accuracy than some advanced subspace clustering algorithms. While this approach yields impressive results, it involves a trade-off between effectiveness and computational cost. In this study, we employ a functional link neural network to transform data samples into a nonlinear domain, and then learn a self-representation matrix from the mapped samples. Because the functional link neural network is a single-layer network, our proposed method achieves high computational efficiency while maintaining desirable clustering performance. By incorporating local similarity regularization to enhance the grouping effect, our method further improves the quality of the clustering results. We name our method Functional Link Neural Network Subspace Clustering (FLNNSC). Furthermore, we propose a convex combination subspace clustering scheme, named Convex Combination Subspace Clustering (CCSC), which combines a linear subspace clustering method with FLNNSC and allows a dynamic balance between linear and nonlinear representations. Extensive experiments on four widely used datasets (Extended Yale B, USPS, COIL20, and ORL) demonstrate that both FLNNSC and CCSC outperform several state-of-the-art subspace clustering methods in clustering accuracy. Our affinity graph experiments reveal that FLNNSC exhibits clear block-diagonal structures. A parameter sensitivity analysis yields hyperparameter recommendations for FLNNSC, and we empirically verify its convergence. Additionally, we show that FLNNSC has a lower computational cost than two high-performing methods.
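To make the pipeline described above concrete, the following is a minimal Python sketch of a self-expressive clustering step on functional-link features: samples are expanded into a nonlinear domain, a self-representation matrix is computed, and a symmetric affinity graph is formed for spectral clustering. The trigonometric expansion form, the `lam` weight, and the closed-form ridge solver are illustrative assumptions, not the paper's exact formulation; in particular, the actual FLNNSC objective also includes the local similarity regularization.

```python
import numpy as np

def flnn_expand(X, order=2):
    """Hypothetical trigonometric functional-link expansion.

    X: (d, n) data matrix with samples as columns.
    Returns the original samples stacked with sin/cos expansions.
    """
    feats = [X]
    for k in range(1, order + 1):
        feats.append(np.sin(k * np.pi * X))
        feats.append(np.cos(k * np.pi * X))
    return np.vstack(feats)

def self_representation(H, lam=0.1):
    """Ridge-regularized self-expressive coding (assumed simplification):
    min_C ||H - H C||_F^2 + lam ||C||_F^2, solved in closed form as
    C = (H^T H + lam I)^{-1} H^T H.
    """
    n = H.shape[1]
    G = H.T @ H
    return np.linalg.solve(G + lam * np.eye(n), G)

# Toy usage: 100 samples of dimension 20.
X = np.random.rand(20, 100)
H = flnn_expand(X)                   # nonlinear functional-link features
C = self_representation(H)           # self-representation matrix
W = 0.5 * (np.abs(C) + np.abs(C.T))  # symmetric affinity for spectral clustering
```

The affinity matrix `W` would then be passed to a standard spectral clustering routine to obtain the final segmentation.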