Abstract

Few-shot learning (FSL) methods have attracted great attention in recent years. However, under the cross-domain few-shot learning (CDFSL) setting, where domain gaps exist between the source and target domains, many FSL methods suffer performance degradation. One possible reason is that a well-trained model can easily learn discriminative features on the source domain but may fail to do so on target domains. To alleviate this problem, it is desirable to train a model on the source domain that can extract useful general features from target tasks regardless of domain gaps. In this paper, we propose a dynamic representation enhancement (DRE) framework that maximizes the model's ability to learn diverse and meaningful features on unknown domains. First, we propose a stage-wise cycling and warm-up feature activation training strategy that drives the model to continuously explore useful general features during training, instead of merely fitting well on the limited training data. Second, we design a task-adaptive loss function that enhances the learning of diverse features by making the model focus on inter-class discriminative features during training. Experimental analysis shows that the proposed DRE framework effectively extracts informative features across various target domains. On the popular cross-domain BSCD-FSL benchmark, the DRE framework is competitive with state-of-the-art methods, achieving average classification accuracies of 51.01%, 66.71%, and 74.13% in the 5-way 1-shot, 5-way 5-shot, and 5-way 20-shot settings, respectively. Our code is publicly available at https://github.com/ideal-123/Dynamic-Representation-Enhancement-framework.
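The abstract mentions a task-adaptive loss that makes the model focus on inter-class discriminative features. The following is only an illustrative sketch of that general idea, not the paper's actual loss (see the linked repository for the real implementation): it computes per-class prototypes (mean embeddings) and applies a hinge-style penalty when prototypes of different classes are closer than a margin, pushing classes apart in feature space. The function name, the margin parameter, and the use of Euclidean distance are all assumptions chosen for the example.

```python
import numpy as np

def prototype_separation_loss(features, labels, margin=1.0):
    """Illustrative inter-class separation penalty (not the paper's exact loss).

    features: (N, D) array of embeddings; labels: (N,) array of class ids.
    Builds a mean-embedding prototype per class, then penalizes each pair of
    prototypes whose Euclidean distance falls below `margin`, encouraging
    inter-class discriminative features.
    """
    classes = np.unique(labels)
    protos = np.stack([features[labels == c].mean(axis=0) for c in classes])
    loss, pairs = 0.0, 0
    for i in range(len(protos)):
        for j in range(i + 1, len(protos)):
            d = np.linalg.norm(protos[i] - protos[j])
            loss += max(0.0, margin - d)  # hinge: only penalize close pairs
            pairs += 1
    return loss / max(pairs, 1)  # average over class pairs

# Well-separated classes incur no penalty; overlapping classes are penalized.
feats = np.array([[0.0, 0.0], [0.0, 0.0], [5.0, 5.0], [5.0, 5.0]])
labs = np.array([0, 0, 1, 1])
print(prototype_separation_loss(feats, labs))                    # 0.0
print(prototype_separation_loss(np.zeros((4, 2)), labs))         # 1.0
```

In practice a term like this would be added to the standard episodic cross-entropy loss with a task-dependent weight; how that weight adapts per task is the part specific to the paper's method.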
