Abstract
Recently, Self-Expressive-based Subspace Clustering (SESC) has been widely applied in pattern clustering and machine learning, as it aims to learn a representation that faithfully reflects the correlation between data points. However, most existing SESC methods directly use the original data as the dictionary, which misses the intrinsic structure (e.g., low-rank and nonlinear) of real-world data. To address this problem, we propose a novel Projection Low-Rank Subspace Clustering (PLRSC) method that integrates feature extraction and subspace clustering into a unified framework. In particular, PLRSC learns a projection transformation to extract low-dimensional features and utilizes a low-rank regularizer to preserve the informative and important structures of the extracted features. The extracted low-rank features effectively enhance the self-expressive property of the dictionary. Furthermore, we extend PLRSC to a nonlinear version (i.e., NPLRSC) by integrating a nonlinear activator into the projection transformation. NPLRSC not only effectively extracts features but also preserves the data structure of the extracted features. The corresponding optimization problem is solved by the Alternating Direction Method (ADM), and we prove that the algorithm converges to a stationary point. Experimental results on real-world datasets validate the superiority of our model over existing subspace clustering methods.
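The abstract does not state the exact objective function. As an illustrative sketch only, a PLRSC-style formulation combining a learned projection, a self-expressive coefficient matrix over the projected dictionary, and nuclear-norm (low-rank) regularization might take the following form; the symbols X, P, Z, E and the weights lambda_1, lambda_2 are our assumed notation, not the paper's.

```latex
% Illustrative PLRSC-style objective (assumed notation, not the paper's exact model):
% X - data matrix with samples as columns
% P - learned projection producing low-dimensional features PX
% Z - self-expressive coefficient matrix over the projected dictionary PX
% E - error term; lambda_1, lambda_2 trade off the regularizers
\begin{equation*}
\min_{P,\,Z,\,E}\;
\|PX\|_{*}
\;+\;\lambda_{1}\,\|Z\|_{*}
\;+\;\lambda_{2}\,\|E\|_{2,1}
\quad\text{s.t.}\quad
PX = PXZ + E,\qquad PP^{\top} = I .
\end{equation*}
% The nuclear norms encourage low-rank structure in the projected features and in
% the representation Z; the orthogonality constraint keeps the projection well-posed.
% A nonlinear variant in the spirit of NPLRSC would replace PX with sigma(PX)
% for some activator sigma, and the whole problem could be split into
% per-variable subproblems and solved alternately, as an ADM-type scheme does.
```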