Abstract

Few-shot image classification is a challenging topic in pattern recognition and computer vision. Few-shot fine-grained image classification is even more challenging, due not only to the few shots of labelled samples but also to the subtle differences that distinguish subcategories in fine-grained images. A recent method called task discrepancy maximisation (TDM) can be embedded into the feature map reconstruction network (FRN) to generate discriminative features, by preserving appearance details through reconstructing the query image and then assigning higher weights to more discriminative channels, producing state-of-the-art performance for few-shot fine-grained image classification. However, due to the small inter-class discrepancy in fine-grained images and the small training set in few-shot learning, the training of FRN+TDM can result in excessively flexible boundaries between subcategories and hence overfitting. To resolve this problem, we propose a simple scheme to amplify inter-class discrepancy and thus improve FRN+TDM. To achieve this aim, instead of developing new modules, our scheme involves only two simple amendments to FRN+TDM: relaxing the inter-class score in TDM, and adding a centre loss to FRN. Extensive experiments on five benchmark datasets show that, although embarrassingly simple, our scheme is quite effective at improving the performance of few-shot fine-grained image classification. The code is available at https://github.com/Airgods/AFRN.git.
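The centre-loss amendment mentioned above follows the standard centre loss formulation (pulling each feature toward its class centre, with centres tracked by a moving average). Below is a minimal NumPy sketch of that general idea, not the paper's implementation; the function names and the update rate `alpha` are illustrative assumptions:

```python
import numpy as np

def center_loss(features, labels, centers):
    """Mean squared distance of each feature to its class centre.

    features: (N, D) array of embeddings
    labels:   (N,) integer class indices
    centers:  (C, D) array of per-class centres
    """
    diffs = features - centers[labels]          # pull each sample toward its own centre
    return 0.5 * np.mean(np.sum(diffs ** 2, axis=1))

def update_centers(features, labels, centers, alpha=0.5):
    """Moving-average centre update (alpha is a hypothetical update rate)."""
    new_centers = centers.copy()
    for c in np.unique(labels):
        mask = labels == c
        # Average offset of the centre from this class's features in the batch.
        delta = np.mean(centers[c] - features[mask], axis=0)
        new_centers[c] = centers[c] - alpha * delta
    return new_centers
```

In training, such a term would be added to the classification loss so that intra-class features cluster tightly, which in turn enlarges the apparent inter-class discrepancy.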
