Abstract

Algorithms for learning overcomplete dictionaries for sparse signal representation are mostly iterative minimization methods that alternate between a sparse coding stage and a dictionary update stage. For most of these methods, however, the consistency of the learned quantities has not been addressed. Based on the observation that the matrix of observed signals can be approximated as a sum of rank-one matrices, a new adaptive dictionary learning algorithm is proposed in this paper. It is derived via sequential adaptive penalized rank-one matrix approximation, where the l1-norm is introduced as a sparsity-promoting penalty. The proposed algorithm uses a block coordinate descent approach to consistently estimate the unknowns and has the advantage of simple closed-form solutions for both the sparse coding and dictionary update stages. Consistency properties of both the estimated sparse code and the estimated dictionary atom are provided. The performance of the proposed algorithm is compared with that of several state-of-the-art algorithms on both simulated data and a real functional magnetic resonance imaging (fMRI) data set from a finger-tapping experiment.
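To illustrate the general flavor of the scheme the abstract describes (sequential penalized rank-one approximation with closed-form block coordinate descent updates), the following Python sketch alternates a soft-thresholding update for the sparse code with a normalized-projection update for the dictionary atom. The function name rank_one_dictionary_learning, the penalty weight lam, and the exact objective used here are illustrative assumptions, not the paper's stated algorithm.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding: closed-form solution of the
    l1-penalized least-squares subproblem."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def rank_one_dictionary_learning(Y, K, lam=0.1, n_iter=20, inner_iter=5, seed=0):
    """Hypothetical sketch of sequential penalized rank-one dictionary learning.

    Y   : (m, n) matrix of observed signals
    K   : number of dictionary atoms
    lam : l1 penalty weight controlling sparsity of the codes
    Returns a dictionary D (m, K) with unit-norm atoms and sparse codes X (K, n).
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    D = rng.standard_normal((m, K))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    X = np.zeros((K, n))

    for _ in range(n_iter):
        for k in range(K):
            # Residual with the k-th rank-one term removed:
            # Y ~ sum_j d_j x_j^T, so E_k = Y - sum_{j != k} d_j x_j^T.
            E = Y - D @ X + np.outer(D[:, k], X[k, :])

            # Block coordinate descent on the penalized rank-one problem
            # min_{d, x} ||E - d x^T||_F^2 + 2*lam*||x||_1  s.t. ||d||_2 = 1.
            for _ in range(inner_iter):
                # Sparse coding step: closed-form soft-thresholding.
                X[k, :] = soft_threshold(E.T @ D[:, k], lam)
                # Dictionary update step: closed-form normalized projection.
                d = E @ X[k, :]
                norm = np.linalg.norm(d)
                if norm > 0:
                    D[:, k] = d / norm

    return D, X
```

Both inner updates have simple closed forms, which is the property the abstract highlights; whether the paper's actual updates coincide with these can only be confirmed from the full text.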