Low-rank representation (LRR) is a classic subspace clustering (SC) algorithm, and many LRR-based methods have been proposed. Generally, LRR-based methods use denoised data as dictionaries for data reconstruction. However, the dictionaries used in LRR-based algorithms are fixed, which limits clustering performance. In addition, most of these methods assume that the input data are linearly correlated, whereas in practice data are mostly nonlinearly correlated. To address these problems, we propose a novel adaptive kernel dictionary-based LRR (AKDLRR) method for SC. Specifically, to exploit nonlinear information, the given data are mapped to a Hilbert space via the kernel trick. The dictionary in AKDLRR is not fixed; it is adaptively learned from the data in the kernel space, making AKDLRR robust to noise and yielding good clustering performance. To solve the AKDLRR model, an efficient procedure based on an alternating optimization strategy is proposed. In addition, a theoretical analysis of the convergence of AKDLRR is presented, which shows that AKDLRR converges in at most three iterations under certain conditions. Experimental results show that AKDLRR achieves the best clustering performance and excellent speed in comparison with competing algorithms.
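To make the general idea concrete, the following is a minimal sketch of a *kernelized* LRR step, not the AKDLRR solver itself: it fixes the kernel dictionary to the mapped data (AKDLRR instead learns the dictionary adaptively) and uses a simple proximal-gradient loop in place of the paper's alternating procedure. All function names, the RBF kernel choice, and the parameters `gamma`, `lam`, and `n_iter` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2),
    # i.e., inner products of the data after the implicit kernel mapping.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def svt(M, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def kernel_lrr(X, gamma=1.0, lam=0.1, n_iter=50):
    """Simplified kernel LRR (illustrative, not the AKDLRR model):
        min_Z ||Z||_* + (lam/2) * ||phi(X) - phi(X) Z||_F^2,
    written entirely in terms of the kernel matrix K = phi(X)^T phi(X)."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    Z = np.zeros((n, n))
    # Step size from the Lipschitz constant of the smooth term's gradient.
    t = 1.0 / (lam * np.linalg.norm(K, 2) + 1e-12)
    for _ in range(n_iter):
        # Gradient of (lam/2) * tr(K - 2 K Z + Z^T K Z) w.r.t. Z.
        grad = lam * (K @ Z - K)
        Z = svt(Z - t * grad, t)  # proximal gradient step on ||Z||_*
    return Z

# The low-rank coefficients would then feed spectral clustering via the
# affinity W = (|Z| + |Z^T|) / 2.
```

The key point the sketch illustrates is that, after the kernel mapping, the reconstruction term can be evaluated using only the Gram matrix `K`, so the nonlinear case never requires the explicit feature map.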