Abstract
This paper addresses the problem of overcomplete transform learning. An alternating-minimization procedure is proposed for solving the formulated sparsifying transform learning problem, and a closed-form solution is derived for the minimization involved in the transform update stage. Compared with existing algorithms, the proposed one significantly reduces the computational complexity. Experiments and simulations with synthetic data and real images demonstrate the superiority of the proposed approach in terms of averaged representation and denoising errors, the percentage of successful and meaningful recoveries of the analysis dictionary and, more significantly, computational efficiency.
Highlights
Signal representations are a cornerstone of signal processing
We focus on the optimized backward greedy (OBG) based denoising approach, comparing the proposed AlgProposed with AlgAKSVD
We present experiments to illustrate the performance of the proposed algorithm AlgProposed and compare it with that of AlgAKSVD given in [12]
Summary
Signal representations are a cornerstone of signal processing. They have evolved from the Fourier transforms, which date back to the late 19th century, through the wavelet expansions of the 1980s, to the sparse representations that have been a hotspot in the signal processing community for the last two decades. Instead of the traditional expansions in terms of bases and frames, sparse redundant representations seek the best approximation of a signal vector by a linear combination of a few atoms from an overcomplete set of well-designed vectors [1]. This topic is closely related to sparsifying dictionary learning [2, 3] and to compressed sensing (CS) [4, 5, 6], an emerging area of research in which the sparsity of the signals to be recovered is a prerequisite.
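The idea of approximating a signal by a few atoms from an overcomplete dictionary can be illustrated with a standard greedy sparse-coding routine. The sketch below is a generic orthogonal matching pursuit (OMP), not the algorithm proposed in this paper; the dictionary `D`, its sizes, and the sparsity level `k` are illustrative assumptions.

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding: select k atoms of D to approximate y.

    Generic OMP sketch (illustrative only, not this paper's algorithm).
    Assumes D has unit-norm columns.
    """
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit all selected atoms by least squares (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

# Tiny demo: a signal built from 2 atoms of a random overcomplete dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x_true = np.zeros(32)
x_true[[3, 17]] = [1.5, -2.0]           # 2-sparse coefficient vector
y = D @ x_true
x_hat = omp(D, y, k=2)                  # 2-sparse approximation of y
```

With an incoherent dictionary and a sufficiently sparse signal, such greedy pursuit typically recovers the generating atoms; the least-squares refit at each step guarantees the residual never grows.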