Abstract
Training large neural network models from scratch is often infeasible: small datasets lead to over-fitting, while large datasets incur prohibitive training time. To address this, transfer learning, i.e., reusing the feature-extraction capacity learned by large models, has become a focal topic in the neural network community. At the classification stage of a pre-trained neural network, either a linear SVM or a Softmax classifier is employed, and it is the only part of the whole model that is trained. In this paper, inspired by transfer learning, we propose a conceptor-based classifier, the Multi-label Conceptor Classifier (MCC), to handle multi-label classification on top of pre-trained neural networks. When no multi-label samples exist, MCC reduces to the Fast Conceptor Classifier, a fast single-label classifier proposed in our previous work, and is therefore also applicable to single-label classification. Moreover, by introducing a random search algorithm, we further improve the performance of MCC on the single-label datasets Caltech-101 and Caltech-256, where it achieves state-of-the-art results. We also evaluate MCC with pre-trained rather than fine-tuned networks on the multi-label dataset PASCAL VOC-2007, where it achieves comparable results.
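The abstract does not detail MCC's formulation, so the following is only a minimal sketch of how a generic conceptor-based classifier could operate on features from a pre-trained network, assuming the standard conceptor definition C = R (R + alpha^{-2} I)^{-1} over a feature correlation matrix R and a positive-evidence decision rule x^T C x. All function names, parameters, and the toy data below are illustrative assumptions, not the paper's method.

    # Sketch of a conceptor-based classifier over pre-trained features.
    # Assumes the standard conceptor C = R (R + alpha^{-2} I)^{-1};
    # MCC's exact formulation is not given in the abstract.
    import numpy as np

    def compute_conceptor(X, aperture=10.0):
        """Compute a conceptor from feature vectors.

        X: (n_samples, n_features) features, e.g. from the penultimate
           layer of a pre-trained network.
        aperture: the conceptor aperture alpha, controlling how tightly
           the conceptor fits the feature correlations.
        """
        n = X.shape[1]
        R = X.T @ X / X.shape[0]  # feature correlation matrix
        return R @ np.linalg.inv(R + aperture ** -2 * np.eye(n))

    def classify(x, conceptors):
        """Assign x to the class whose conceptor yields the highest
        positive evidence x^T C x (a common conceptor decision rule)."""
        evidences = [x @ C @ x for C in conceptors]
        return int(np.argmax(evidences))

    # Hypothetical usage: three classes whose features occupy disjoint
    # coordinate blocks, one conceptor fitted per class.
    rng = np.random.default_rng(0)
    feats_per_class = []
    for i in range(3):
        F = np.zeros((50, 9))
        F[:, 3 * i:3 * i + 3] = rng.normal(size=(50, 3))
        feats_per_class.append(F)
    conceptors = [compute_conceptor(F) for F in feats_per_class]
    print(classify(feats_per_class[1][0], conceptors))  # expected: 1

Since conceptors admit Boolean operations such as OR and AND, per-label conceptors can in principle be combined for multi-label decisions, which is presumably the direction MCC develops; the sketch above covers only the single-label case.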