Abstract

Few-shot learning (FSL) is a challenging yet promising technique that aims to discriminate objects from only a few labeled examples. Learning a high-quality feature representation is key when data are scarce, and many existing models attempt to extract general information at the sample or task level. However, sample-level feature representation limits a model's generalizability across tasks, while task-level representation may lose class characteristics through excessive information aggregation. In this article, we synchronize class-specific and task-shared information from the class and task levels to obtain a better representation. Structure-based contrastive learning is introduced to obtain class-specific representations by increasing the interclass distance. A hierarchical class structure is constructed by clustering semantically similar classes, following the idea of granular computing. Under this class structure, samples from different but semantically similar classes are harder to distinguish than samples from classes with large interclass differences. To this end, structure-guided contrastive learning is introduced to learn class-specific information. A hierarchical graph neural network is then established to transfer task-shared information from coarse to fine: it hierarchically infers the target sample from all samples in the task and yields a more general representation for FSL classification. Experiments on four benchmark datasets demonstrate the advantages of our model over several state-of-the-art models.
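To make the structure-guided idea more concrete, the sketch below is a minimal toy version, not the authors' implementation: it builds class prototypes, groups them into coarse semantic clusters (a one-step nearest-seed assignment stands in for the paper's hierarchical clustering), and then up-weights "hard" negatives that share a coarse cluster with the anchor inside a contrastive loss. All function names, the clustering shortcut, and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def prototypes(features, labels):
    """Mean feature vector per class (a standard few-shot prototype)."""
    classes = np.unique(labels)
    return np.stack([features[labels == c].mean(axis=0) for c in classes]), classes

def coarse_groups(protos, n_groups):
    """Toy stand-in for hierarchical semantic clustering: assign each class
    prototype to the nearest of n_groups seed prototypes (one k-means-style step)."""
    seeds = protos[:n_groups]
    dist = np.linalg.norm(protos[:, None, :] - seeds[None, :, :], axis=-1)
    return dist.argmin(axis=1)  # coarse-group id per class

def structure_guided_contrastive_loss(z, labels, group_of_class,
                                      temp=0.1, hard_weight=2.0):
    """InfoNCE-style loss where negatives from the same coarse group as the
    anchor (semantically similar classes) are up-weighted in the denominator."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity space
    sim = z @ z.T / temp
    n = len(labels)
    total = 0.0
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        if not pos.any():
            continue
        # negatives sharing the anchor's coarse group are "hard": up-weight them
        w = np.ones(n)
        same_group = group_of_class[labels] == group_of_class[labels[i]]
        w[same_group & (labels != labels[i])] = hard_weight
        mask = np.arange(n) != i
        denom = (w[mask] * np.exp(sim[i][mask])).sum()
        num = np.exp(sim[i][pos]).sum()
        total += -np.log(num / denom)
    return total / n
```

Up-weighting same-group negatives increases their contribution to the denominator, so the loss pushes hardest on exactly the pairs the hierarchical class structure flags as confusable, which mirrors the abstract's motivation for structure-guided contrastive learning.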
