Abstract
Traditional graph clustering methods consist of two sequential steps, i.e., constructing an affinity matrix from the original data and then performing spectral clustering on the resulting affinity matrix. This two-step strategy optimizes each step separately, but cannot guarantee globally optimal clustering results. Moreover, an affinity matrix learned directly from the original data can seriously degrade clustering performance, since high-dimensional data are usually noisy and may contain redundancy. To address these issues, this paper proposes a Low-rank Sparse Subspace (LSS) clustering method that dynamically learns the affinity matrix from a low-dimensional space of the original data. Specifically, we learn a transformation matrix that projects the original data into a low-dimensional space, by conducting feature selection and subspace learning within a sample self-representation framework. We then impose a rank constraint, together with the affinity matrix obtained directly from the original data, to construct a dynamic and intrinsic affinity matrix. Each of these three matrices is updated iteratively while the other two are held fixed. In this way, the affinity matrix learned from the low-dimensional space directly yields the final clustering result. Extensive experiments on both synthetic and real datasets show that the proposed LSS method outperforms state-of-the-art clustering methods.
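As a rough illustration of the alternating scheme described in the abstract, the NumPy sketch below cycles through updates of a projection matrix W, a self-representation matrix Z, and an affinity matrix S while holding the other two fixed. The variable names, regularizers, and closed-form updates are illustrative assumptions, not the paper's actual optimization rules (in particular, the rank constraint on S is only indicated by a comment).

```python
import numpy as np

def lss_sketch(X, dim=10, n_iter=30, lam=0.1):
    """Conceptual alternating-update loop; all update rules are placeholders."""
    d, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.standard_normal((d, dim))   # projection to a low-dimensional space
    Z = np.eye(n)                       # sample self-representation coefficients
    S = np.full((n, n), 1.0 / n)        # dynamic affinity matrix

    for _ in range(n_iter):
        # (1) Update Z with W, S fixed: ridge-regularized self-representation
        #     in the projected space W^T X (a stand-in for the sparse/low-rank step).
        P = W.T @ X                                   # dim x n projected data
        G = P.T @ P
        Z = np.linalg.solve(G + lam * np.eye(n), G)   # min ||P - P Z||^2 + lam ||Z||^2

        # (2) Update S with Z fixed: symmetrize |Z| and row-normalize so S acts
        #     as an affinity matrix; the paper additionally enforces a rank
        #     constraint on the graph Laplacian of S at this step.
        A = 0.5 * (np.abs(Z) + np.abs(Z).T)
        S = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)

        # (3) Update W with Z, S fixed: choose directions that best align X
        #     with its self-representation X Z (an SVD-based placeholder).
        U, _, _ = np.linalg.svd(X @ Z @ X.T, full_matrices=False)
        W = U[:, :dim]

    # Cluster assignments would then be read from the connected components
    # (or a spectral embedding) of S.
    return W, Z, S
```

The key design point conveyed by the abstract is the coupling: the affinity matrix is refined in the learned low-dimensional space rather than fixed once from the raw data, so the projection and the affinity matrix improve each other across iterations.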