Abstract
In deep dictionary learning, multiple dictionaries are learned from information at various levels of abstraction. We propose a novel hierarchical discriminative dictionary learning layer embedded within a neural network trained with an image classification objective. Discrimination is induced in the learned synthesis dictionaries at multiple hierarchical levels simply by using a one-hot encoding of the class labels during the training backward pass. In addition, local sparse representation objectives are approximated during the forward pass, introducing local regularization. We evaluate our proposal on five well-known datasets and either outperform state-of-the-art methods or achieve competitive classification results.
Highlights
Dictionary learning aims to represent a data matrix by a linear combination of an underlying basis, where both the basis and the coefficients of the linear combination are learned.
Our results confirm the benefits of the convolutional modules in our proposal, which are further improved by our hierarchical discriminative dictionary learning (HDDL) layers, as shown in our comparison against convolutional neural networks in Table 2 (ORL database) and other state-of-the-art methods.
The learned dictionaries end up tuned both for the final classification objective and for the local representation objective implicit at each HDDL layer.
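The dictionary learning objective described above can be illustrated with a toy alternating-minimization sketch (illustrative only; the paper's HDDL layers embed this idea inside a neural network, and the thresholding scheme here is a simplification for exposition). A data matrix Y is approximated by D @ X, alternating a crude sparse-coding step with a dictionary update:

```python
import numpy as np

# Toy alternating-minimization sketch of dictionary learning. We approximate
# a data matrix Y by D @ X, alternating a crude sparse-coding step (least
# squares followed by a hard threshold) with a dictionary update.
rng = np.random.default_rng(0)
Y = rng.standard_normal((20, 100))   # 100 signals of dimension 20
D = rng.standard_normal((20, 30))    # overcomplete dictionary: 30 atoms
D /= np.linalg.norm(D, axis=0)       # unit-norm atoms

for _ in range(25):
    # Sparse coding: least-squares coefficients, small entries zeroed out
    X = np.linalg.lstsq(D, Y, rcond=None)[0]
    X[np.abs(X) < 0.5] = 0.0
    # Dictionary update: refit atoms to the current sparse codes
    D = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T
    # Renormalize atoms, rescaling the codes so D @ X is unchanged
    norms = np.linalg.norm(D, axis=0) + 1e-12
    D /= norms
    X *= norms[:, None]

recon_err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
print(round(recon_err, 3))
```

Practical methods replace the hard threshold with a proper l1-regularized solver and update atoms one at a time, but the two-phase structure is the same.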
Summary
Dictionary learning aims to represent a data matrix by a linear combination of an underlying basis, where both the basis (dictionary) and the coefficients of the linear combination are learned. The sparse representation-based classification (SRC) [3] method learns sparse coefficients using the training data matrix as a fixed dictionary structured by class, and those coefficients drive a self-contained classifier. This avoids the computational cost of learning a dictionary and provides good classification results as long as the training data is representative of the actual test data. Although competitive, these methods do not achieve state-of-the-art classification results; interestingly, however, the learned representations do not require data labels for training, since they are learned in an unsupervised way. In [20], sparse coding layers with wide and slim dictionaries are proposed to induce discriminative features and clustered representations, respectively, achieving competitive results in image classification.
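The SRC idea described above can be sketched as follows: the training matrix itself is the dictionary, with columns grouped by class, and a test sample is assigned to the class whose columns best reconstruct it. This is an illustrative simplification, with ridge-regularized coding standing in for the true l1 sparse coding used by SRC, and synthetic class subspaces in place of real images:

```python
import numpy as np

# Toy sketch of SRC-style classification: the training matrix is the
# dictionary (columns grouped by class); a test sample is coded against it
# and assigned to the class with the smallest class-wise residual.
rng = np.random.default_rng(1)
n_per_class, dim, n_classes = 15, 10, 3
# Give each class its own (here mutually orthogonal) 3-dim subspace
Q, _ = np.linalg.qr(rng.standard_normal((dim, 3 * n_classes)))
bases = [Q[:, 3 * c:3 * c + 3] for c in range(n_classes)]
train = np.hstack([B @ rng.standard_normal((3, n_per_class)) for B in bases])
labels = np.repeat(np.arange(n_classes), n_per_class)

def src_predict(y, D, labels, lam=0.1):
    # Code y against the whole dictionary (ridge stand-in for l1 coding)
    x = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    # Classify by the smallest class-wise reconstruction residual
    residuals = [np.linalg.norm(y - D @ np.where(labels == c, x, 0.0))
                 for c in np.unique(labels)]
    return int(np.argmin(residuals))

test_sample = bases[2] @ rng.standard_normal(3)  # sample from class 2
pred = src_predict(test_sample, train, labels)
print(pred)  # recovers class 2
```

Because the dictionary is just the training data, no learning phase is needed; the cost is shifted entirely to coding each test sample.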