Abstract

In this paper, we propose an overcomplete nonnegative dictionary learning method for the sparse representation of signals, posed as a nonnegative matrix factorization (NMF) problem with a sparsity constraint. We show that, with the sparsity constraint introduced, the problem can be cast as two sequential optimization problems over parabolic functions, although the parabolas differ in form from those of the unconstrained case [1,2]. The problems can therefore be solved efficiently by generalizing the hierarchical alternating least squares (HALS) algorithm, whose original form handles only the unconstrained case. The dictionary learning process converges quickly and the computational cost is low. Numerical experiments show that the algorithm outperforms nonnegative K-SVD (NN-KSVD) and two other compared algorithms, while also reducing the computational cost remarkably.
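
To make the idea concrete, below is a minimal sketch of a HALS-style update for NMF with an L1 sparsity penalty on the coefficient matrix, assuming the objective ||V - WH||_F^2 + lam * sum(H) with W, H >= 0. The penalty form, the name sparse_hals_nmf, and the parameter lam are illustrative assumptions, not the paper's exact formulation; the sketch only shows why each scalar subproblem is a parabola with a closed-form nonnegative minimizer.

    import numpy as np

    def sparse_hals_nmf(V, r, lam=0.1, n_iter=200, seed=0):
        """HALS-style NMF sketch with an L1 penalty on H (illustrative).

        Minimizes ||V - W H||_F^2 + lam * sum(H) subject to W, H >= 0.
        Each scalar subproblem is a parabola; its nonnegative minimizer
        has the closed form used below.
        """
        rng = np.random.default_rng(seed)
        m, n = V.shape
        W = np.abs(rng.standard_normal((m, r)))
        H = np.abs(rng.standard_normal((r, n)))
        eps = 1e-12  # guard against division by zero
        for _ in range(n_iter):
            # Update rows of H one at a time: the L1 term shifts the
            # parabola's vertex by lam/2 before clipping at zero.
            WtV, WtW = W.T @ V, W.T @ W
            for k in range(r):
                num = WtV[k] - WtW[k] @ H + WtW[k, k] * H[k]
                H[k] = np.maximum(0.0, (num - 0.5 * lam) / (WtW[k, k] + eps))
            # Update columns of W one at a time: a plain nonnegative
            # least-squares parabola, as in unconstrained HALS.
            VHt, HHt = V @ H.T, H @ H.T
            for k in range(r):
                num = VHt[:, k] - W @ HHt[:, k] + HHt[k, k] * W[:, k]
                W[:, k] = np.maximum(0.0, num / (HHt[k, k] + eps))
        return W, H

Sequentially sweeping over single rows of H and columns of W is what keeps the per-iteration cost low: each subproblem is one-dimensional in every entry, so no matrix inversion is needed.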
