Abstract
We propose a new algorithm for the design of overcomplete dictionaries for sparse coding: neural gas for dictionary learning (NGDL), which uses a set of solutions for the sparse coefficients in each update step of the dictionary. To obtain such a set of solutions, we additionally propose the bag of pursuits (BOP) method for sparse approximation. Using BOP to determine the sparse coefficients, we show in an image encoding experiment that, given limited training data and limited computation time, the NGDL update of the dictionary outperforms the standard gradient approach used, for instance, in the Sparsenet algorithm, as well as other state-of-the-art dictionary learning methods such as the method of optimal directions (MOD) and the widely used K-SVD algorithm. In an application to image reconstruction, dictionaries trained with this algorithm outperform not only overcomplete Haar wavelets and overcomplete discrete cosine transforms, but also dictionaries obtained with widely used algorithms such as K-SVD.
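To make the core idea concrete, the following is a minimal sketch of a dictionary update that uses a *set* of candidate sparse solutions per sample, as the abstract describes. It assumes a soft-competitive, neural-gas-style weighting of the candidates and stands in for BOP with a simple multi-start orthogonal matching pursuit; the function names, the branching strategy, and the exact weighting scheme are illustrative assumptions, not the algorithm as specified in the paper.

```python
import numpy as np

def candidate_sparse_codes(D, x, n_candidates, n_nonzero):
    """Illustrative stand-in for the bag of pursuits (BOP): return several
    candidate sparse coefficient vectors for x by running a greedy pursuit
    from different starting atoms (assumption; the paper's BOP may differ)."""
    M = D.shape[1]
    codes = []
    order = np.argsort(-np.abs(D.T @ x))          # atoms ranked by correlation
    for start in order[:n_candidates]:
        support = [int(start)]
        for _ in range(n_nonzero - 1):
            coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
            residual = x - D[:, support] @ coeffs
            corr = np.abs(D.T @ residual)
            corr[support] = -np.inf               # do not reselect atoms
            support.append(int(np.argmax(corr)))
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        a = np.zeros(M)
        a[support] = coeffs
        codes.append(a)
    return codes

def ngdl_epoch(D, X, n_candidates=4, n_nonzero=3, lr=0.05, lam=1.0):
    """One pass over the training data: every candidate solution contributes
    a gradient step on the reconstruction error, weighted by its rank
    (neural-gas-style soft competition; weighting scheme is an assumption)."""
    for x in X.T:
        codes = candidate_sparse_codes(D, x, n_candidates, n_nonzero)
        errors = np.array([np.linalg.norm(x - D @ a) for a in codes])
        ranks = np.argsort(np.argsort(errors))    # 0 = best candidate
        weights = np.exp(-ranks / lam)
        for a, w in zip(codes, weights):
            residual = x - D @ a
            D += lr * w * np.outer(residual, a)   # step on ||x - Da||^2
        D /= np.linalg.norm(D, axis=0, keepdims=True)  # keep atoms unit-norm
    return D
```

In practice one would anneal both the learning rate and the neighborhood parameter `lam` over epochs, so that early updates are strongly soft-competitive and later updates approach a winner-take-all gradient step; this annealing schedule is likewise an assumption made for the sketch.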