Abstract

Dictionary learning and deep learning can be combined to boost the performance of classification tasks. However, existing combined methods often learn multi-level dictionaries, each embedded in a network layer; they involve a large number of parameters (the elements of many dictionaries) and thus easily incur prohibitive computational cost and even overfitting. In this paper, we present a novel deep Auto-Encoder based Structured Dictionary (AESD) learning model, in which only one dictionary, composed of class-specific sub-dictionaries, needs to be learned; supervision is introduced by imposing discriminative category constraints that endow the dictionary with discrimination. The encoding layers are designed with shared parameters that depend exactly on the dictionary carried by the decoding layer. As a result, the learning process reduces to forward-propagation based optimization with respect to the dictionary only, leading to lightweight network training. Beyond directly using the trained encoding network with a minimum-reconstruction-residual scheme for single-image classification, we broaden the method's application spectrum: in the testing phase, we extend the proposed prototype into a Convolutional Encoder based Block Sparse Representation (CEBSR) model that promotes latent block sparsity in the joint representation of an image set, achieving improved image set based classification. Extensive experiments verify the performance of the learned dictionary for image classification and the superiority of our extended model over state-of-the-art image set classification methods.
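To illustrate the minimum-reconstruction-residual scheme mentioned above, the following is a minimal sketch, not the paper's actual implementation: it assumes a structured dictionary already partitioned into class-specific sub-dictionaries and codes a test sample with plain least squares (standing in for the paper's learned encoder), then assigns the class whose sub-dictionary yields the smallest reconstruction residual.

```python
import numpy as np

def min_residual_classify(x, sub_dicts):
    """Assign x to the class whose sub-dictionary reconstructs it best.

    x         : (d,) test sample.
    sub_dicts : list of (d, k_c) class-specific sub-dictionaries.

    Simplification: coefficients are obtained by least squares rather
    than by the trained encoding network described in the paper.
    """
    residuals = []
    for D_c in sub_dicts:
        # Code x over this class's atoms, then measure the residual.
        a_c, *_ = np.linalg.lstsq(D_c, x, rcond=None)
        residuals.append(np.linalg.norm(x - D_c @ a_c))
    return int(np.argmin(residuals))


# Toy usage: a sample lying in the span of class 0's atoms is
# recovered with (near-)zero residual by D0 but not by D1.
rng = np.random.default_rng(0)
D0 = rng.standard_normal((8, 3))
D1 = rng.standard_normal((8, 3))
x = D0 @ np.array([1.0, -0.5, 2.0])
print(min_residual_classify(x, [D0, D1]))
```

The design choice here is the standard sparse-representation-classification decision rule: discrimination comes from the block structure of the dictionary, so no separate classifier head is needed.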
