Abstract

In recent years, there has been growing interest in dictionary learning with nonconvex sparsity-inducing penalties. However, how to efficiently solve dictionary learning with a nonconvex penalty remains an open problem. In this paper, we present an efficient DC-based algorithm for dictionary learning with the nonconvex smoothly clipped absolute deviation (SCAD) penalty, which promotes strong sparsity and accurate estimation. The optimization problem we consider is a minimization of the representation error under the SCAD penalty. Our approach is based on a decomposition scheme that splits the whole problem into a set of subproblems over single-vector factors. To handle the nonconvexity of the representation error in these subproblems, we use an alternating optimization scheme that updates one factor while the other is held fixed. To handle the nonconvexity of the SCAD penalty, we apply the difference-of-convex (DC) decomposition to convert each nonconvex subproblem into a sequence of convex problems and employ the DC algorithm (DCA) to solve them; simple closed-form solutions can then be derived. As verified by numerical experiments on synthetic and real-world data, the proposed algorithm outperforms state-of-the-art algorithms with different sparsity-inducing constraints.
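
For reference, the SCAD penalty (Fan and Li, 2001) mentioned above has the standard form shown below. The DC split that follows is one standard decomposition; the abstract does not spell out the paper's exact construction, so this is an assumption for illustration.

$$
p_\lambda(t)=
\begin{cases}
\lambda|t|, & |t|\le\lambda,\\[2pt]
\dfrac{2a\lambda|t|-t^2-\lambda^2}{2(a-1)}, & \lambda<|t|\le a\lambda,\\[2pt]
\dfrac{(a+1)\lambda^2}{2}, & |t|>a\lambda,
\end{cases}
\qquad a>2,
$$

and a standard DC decomposition writes $p_\lambda(t)=\lambda|t|-q_\lambda(t)$ with the convex function

$$
q_\lambda(t)=
\begin{cases}
0, & |t|\le\lambda,\\[2pt]
\dfrac{(|t|-\lambda)^2}{2(a-1)}, & \lambda<|t|\le a\lambda,\\[2pt]
\lambda|t|-\dfrac{(a+1)\lambda^2}{2}, & |t|>a\lambda,
\end{cases}
$$

so that linearizing $q_\lambda$ at the current iterate leaves a convex, $\ell_1$-regularized problem at each DC step.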

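A minimal sketch of how such a DC iteration could look for the sparse-coding step (dictionary D fixed) is given below. This is illustrative only: the function names, the ISTA inner solver, and the default a = 3.7 (the value recommended by Fan and Li) are our assumptions, not the paper's stated implementation.

import numpy as np

def scad_grad_q(x, lam, a=3.7):
    """Gradient of the convex part q in the DC split
    SCAD(t) = lam*|t| - q(t) (a standard decomposition, assumed here)."""
    ax = np.abs(x)
    g = np.zeros_like(x)
    mid = (ax > lam) & (ax <= a * lam)
    g[mid] = np.sign(x[mid]) * (ax[mid] - lam) / (a - 1)
    high = ax > a * lam
    g[high] = lam * np.sign(x[high])
    return g

def soft_threshold(z, t):
    """Closed-form proximal operator of t*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def dca_sparse_code(D, y, lam, a=3.7, n_dc=10, n_ista=50):
    """Sparse-coding step: min_x 0.5*||y - D x||^2 + sum_i SCAD(x_i),
    via DCA: linearize q at x^k, then solve the convex subproblem
    0.5*||y - D x||^2 + lam*||x||_1 - <grad q(x^k), x> with ISTA."""
    x = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    for _ in range(n_dc):
        v = scad_grad_q(x, lam, a)  # fixed linearization point for this DC step
        for _ in range(n_ista):
            grad = D.T @ (D @ x - y) - v
            x = soft_threshold(x - grad / L, lam / L)
    return x

In the full alternating scheme, this step would be interleaved with a dictionary update (e.g., column-wise least squares with normalization), following the single-vector decomposition described in the abstract.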