Abstract

Few-shot classification aims to recognize novel categories from only a handful of labeled samples. The classic Relation Network (RN) compares support-query sample pairs for few-shot classification but overlooks contextual information in the support set, limiting its comparison capability. This work reformulates learning the relationship between a query sample and each support class as a seq2seq problem. We introduce a Sample-level Transformer-based Relation Network (SLTRN) that uses sample-level self-attention to enhance the comparison ability of the relation module by mining latent relationships among support classes. SLTRN achieves performance comparable to state-of-the-art methods on standard benchmarks, excelling in particular in the 1-shot setting with 52.11% and 67.55% accuracy on miniImageNet and CUB, respectively. Extensive ablation experiments validate the effectiveness and optimal settings of SLTRN. The experimental code for this work is available at https://github.com/ZitZhengWang/SLTRN.
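The core idea, treating the support-class representations together with the query as a sequence and applying self-attention before computing relation scores, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding dimension, the use of `nn.MultiheadAttention`, and the linear relation-score head are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class SampleLevelRelationSketch(nn.Module):
    """Hypothetical sketch of the SLTRN idea: contextualize support-class
    prototypes and the query jointly via self-attention, then score the
    query against each contextualized prototype."""

    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.score = nn.Linear(dim, 1)  # relation-score head (assumed)

    def forward(self, support, query):
        # support: (N, dim) class prototypes; query: (dim,) query embedding
        seq = torch.cat([support, query.unsqueeze(0)], dim=0).unsqueeze(0)  # (1, N+1, dim)
        ctx, _ = self.attn(seq, seq, seq)  # self-attention mines inter-class relationships
        ctx_support, ctx_query = ctx[0, :-1], ctx[0, -1]
        # element-wise comparison of the query with each contextualized prototype
        return self.score(ctx_support * ctx_query).squeeze(-1)  # (N,) relation scores

model = SampleLevelRelationSketch()
scores = model(torch.randn(5, 64), torch.randn(64))  # 5-way episode
print(scores.shape)
```

In a 5-way episode this yields one relation score per support class; the predicted class is the argmax over the five scores.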
