Abstract

Sparse representation can efficiently approximate signals as a linear combination of a few bases from an over-complete dictionary. In data compression, however, its efficiency and popularity are hindered by the extra overhead of encoding the sparse coefficients. Establishing an accurate rate model for sparse coding and dictionary learning is therefore important, yet it has not been fully explored in the context of sparse representation. By the Shannon entropy inequality, the entropy of a data source is upper-bounded by a function of its variance, so the variance can reflect the actual coding bits. Accordingly, a Globally Variance-Constrained Sparse Representation (GVCSR) model is proposed, in which a variance-constrained rate term is introduced into the conventional sparse representation. To solve the resulting non-convex optimization problem, we employ the Alternating Direction Method of Multipliers (ADMM) for both sparse coding and dictionary learning, and the model achieves state-of-the-art rate-distortion performance in image representation.
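The abstract does not state the objective function explicitly. One plausible formalization, assuming an l1 sparsity surrogate and hypothetical weighting parameters $\lambda$ and $\gamma$ (neither is given in the source), is

$$
\min_{\mathbf{D},\,\mathbf{A}}\ \|\mathbf{X}-\mathbf{D}\mathbf{A}\|_F^2 \;+\; \lambda\,\|\mathbf{A}\|_1 \;+\; \gamma\,\mathrm{Var}(\mathbf{A}),
$$

where $\mathrm{Var}(\mathbf{A})$ is the global variance of the sparse coefficients. The rate interpretation follows from the Gaussian maximum-entropy inequality, $H(s) \le \tfrac{1}{2}\log(2\pi e\,\sigma^2)$: constraining the variance constrains an upper bound on the entropy, and hence on the coding bits.

The sketch below illustrates how the sparse-coding step of such a model could be solved with ADMM for a fixed dictionary. It is a minimal sketch under the assumptions above, not the paper's algorithm; the function name, parameters, and l1 surrogate are all illustrative.

```python
# Minimal sketch: ADMM sparse coding with an l1 term and a global variance
# penalty, assuming the hypothetical objective given above (not the paper's
# exact formulation). lam, gam, rho are illustrative parameters.
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gvcsr_sparse_code(x, D, lam=0.1, gam=0.05, rho=1.0, n_iter=100):
    """Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 + gam*Var(a) via ADMM.

    Splitting: variable `a` carries the smooth terms (least squares and
    variance), `z` carries the l1 term, with constraint a = z and scaled
    dual variable u.
    """
    n = D.shape[1]
    # Centering projector: Var(a) = ||C a||^2 / n, so grad = (2*gam/n) C a.
    C = np.eye(n) - np.ones((n, n)) / n
    # The quadratic system matrix is fixed across iterations; invert once.
    Q_inv = np.linalg.inv(D.T @ D + (2.0 * gam / n) * C + rho * np.eye(n))
    Dtx = D.T @ x
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(n_iter):
        a = Q_inv @ (Dtx + rho * (z - u))     # smooth subproblem
        z = soft_threshold(a + u, lam / rho)  # l1 proximal step
        u = u + a - z                         # dual update on a = z
    return z

# Usage: recover a sparse code over a random over-complete dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
x = D[:, :5] @ rng.standard_normal(5)     # signal supported on 5 atoms
a = gvcsr_sparse_code(x, D)
print(np.count_nonzero(a), np.linalg.norm(x - D @ a))
```

In this splitting, the variance penalty only modifies the quadratic subproblem for `a`, so the per-iteration cost is unchanged relative to plain ADMM lasso; dictionary learning would alternate this step with a dictionary update, as the abstract indicates.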
