Abstract

Sparse coding is a popular technique for learning compact data representations and has been used in many applications. In practice, however, sparse coding suffers from an instability issue that degrades performance and has therefore attracted considerable study. Whereas traditional graph sparse coding preserves only the neighborhood structure of the data, this study integrates the low-rank representation (LRR) to remedy the inconsistency of sparse coding by preserving the subspace structures of the high-dimensional observations. The proposed method, dubbed low-rank graph regularized sparse coding (LogSC), learns the sparse codes and the low-rank representation jointly rather than in the traditional two-step manner. Because the two data representations share a dictionary matrix, the resulting sparse representation on this dictionary benefits from the LRR. We solve the optimization problem of LogSC using the linearized alternating direction method with adaptive penalty (LADMAP). Experimental results show that the proposed method learns discriminative features and is robust to various types of noise. This work provides a one-step approach to integrating graph embedding into representation learning.
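To make the joint formulation concrete, the objective can be sketched along the following lines. This is a hedged illustration rather than the paper's exact objective: the data matrix $X$, dictionary $D$, sparse codes $S$, low-rank representation $Z$, noise term $E$, graph Laplacian $L$, and trade-off weights $\lambda, \gamma, \beta, \eta$ are assumed notation, and the shared-dictionary constraint $X = DZ + E$ is one plausible way to couple the two representations.

\[
\begin{aligned}
\min_{D,\,S,\,Z,\,E}\;& \underbrace{\|X - DS\|_F^2}_{\text{reconstruction}}
  \;+\; \lambda\,\|S\|_1
  \;+\; \gamma\,\operatorname{tr}\!\bigl(S L S^{\top}\bigr)
  \;+\; \beta\,\|Z\|_{*}
  \;+\; \eta\,\|E\|_{2,1} \\
\text{s.t.}\;& \; X = DZ + E,
\end{aligned}
\]

Under a formulation of this shape, LADMAP would update each variable in turn, linearizing the smooth coupling terms around the current iterate and adaptively increasing the penalty parameter of the augmented Lagrangian, so that the nuclear-norm and $\ell_1$ subproblems admit closed-form proximal solutions (singular value thresholding and soft thresholding, respectively).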
