Abstract

Sparse representation has proven to be a powerful tool for the analysis and processing of signals and images. Whereas most existing sparse representation methods are based on the synthesis model, this paper addresses sparse representation with the so-called analysis model. The theory of the l1/2-norm regularizer in compressive sensing (CS) shows that it yields solutions with stronger sparsity than the l1-norm regularizer. In this paper, we propose a novel and efficient algorithm for the analysis dictionary learning problem with the l1/2-norm regularizer as the sparsity constraint. The algorithm consists of two stages: an analysis sparse coding stage and an analysis dictionary update stage. In the sparse coding stage, adaptive half-thresholding is employed to solve the l1/2-norm regularized problem. In the dictionary update stage, the solution is obtained directly by solving the associated least-squares problem followed by a projection. Our simulation study shows that the main advantage of the proposed algorithm is its greater learning efficiency across different cosparsities.
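The half-thresholding step mentioned above has a known closed form for l1/2-regularized problems (the half-thresholding operator introduced by Xu et al. in the l1/2 regularization literature). The sketch below is an illustrative NumPy implementation of that standard operator, not the adaptive variant of this paper; the parameter name `lam` (the regularization weight) is an assumption for illustration.

```python
import numpy as np

def half_threshold(x, lam):
    """Elementwise half-thresholding operator for l1/2 regularization.

    Entries with magnitude below the threshold are set to zero;
    larger entries are shrunk via the closed-form cosine expression.
    Sketch of the standard operator (Xu et al. form), not the paper's
    adaptive variant.
    """
    x = np.asarray(x, dtype=float)
    # Threshold below which the solution is exactly zero.
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(x)
    mask = np.abs(x) > thresh
    xm = x[mask]
    # Closed-form shrinkage for the surviving entries.
    phi = np.arccos((lam / 8.0) * (np.abs(xm) / 3.0) ** (-1.5))
    out[mask] = (2.0 / 3.0) * xm * (
        1.0 + np.cos(2.0 * np.pi / 3.0 - (2.0 / 3.0) * phi)
    )
    return out
```

For large-magnitude entries the operator approaches the identity, while small entries are zeroed exactly, which is the stronger sparsity-promoting behavior the abstract attributes to the l1/2-norm regularizer.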
