Abstract
We propose a fast and efficient algorithm for learning overcomplete dictionaries for sparse representation of signals using the nonconvex log-regularizer for sparsity. The special importance of the log-regularizer has been recognized in recent studies on sparse modeling. The log-regularizer, however, leads to a nonconvex and nonsmooth optimization problem that is difficult to solve efficiently. In this paper, we propose a method based on a decomposition scheme and alternating optimization that turns the whole problem into a set of univariate subminimizations, each depending on only a single dictionary atom or coefficient vector. Although the subproblem with respect to the coefficient vector remains nonsmooth and nonconvex, it becomes much simpler and, remarkably, admits a closed-form solution through a novel technique, the log-thresholding operator. The main advantage of the proposed algorithm is that, as suggested by our analysis and simulation study, it is more efficient than state-of-the-art algorithms with different sparsity constraints.
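As a rough illustration of what such a log-thresholding operator can look like: the abstract does not specify the exact penalty, so the sketch below assumes the common form λ·log(ε + |x|) and solves the scalar proximal problem min_x ½(x − y)² + λ·log(ε + |x|), whose stationary points are roots of a quadratic. The function name log_threshold and the parameters lam and eps are illustrative, not taken from the paper.

```python
import numpy as np

def log_threshold(y, lam, eps):
    """Element-wise proximal operator of the log penalty lam * log(eps + |x|),
    i.e. a minimizer of 0.5 * (x - y)**2 + lam * log(eps + |x|).

    Assumed penalty form; the paper's operator may differ in detail.
    """
    y = np.asarray(y, dtype=float)
    a = np.abs(y)
    # Stationarity for x > 0: x - a + lam / (eps + x) = 0, a quadratic in x
    # with discriminant (a + eps)^2 - 4 * lam.
    disc = (a + eps) ** 2 - 4.0 * lam
    ok = disc >= 0.0
    # Larger root of the quadratic (candidate nonzero solution), clamped to >= 0.
    root = np.where(ok, 0.5 * ((a - eps) + np.sqrt(np.maximum(disc, 0.0))), 0.0)
    root = np.maximum(root, 0.0)
    # Keep the nonzero candidate only if it beats x = 0 in the objective,
    # which produces the hard-thresholding-like dead zone around zero.
    f_root = 0.5 * (root - a) ** 2 + lam * np.log(eps + root)
    f_zero = 0.5 * a ** 2 + lam * np.log(eps)
    return np.where(ok & (f_root < f_zero), np.sign(y) * root, 0.0)
```

Applied coordinate-wise to a coefficient vector inside the alternating scheme, this shrinks large entries only mildly (the log penalty flattens for large |x|) while setting small entries exactly to zero, which is the behavior that makes the coefficient subproblem solvable in closed form despite its nonconvexity.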