Abstract

Few-shot image classification, which aims to recognize unseen classes given only a few labeled samples, has attracted extensive attention. Due to the large intraclass variance and interclass similarity of remote sensing scenes, the task under such circumstances is much more challenging than general few-shot image classification. Most existing prototype-based few-shot algorithms calculate prototypes directly from support samples and ignore the validity of those prototypes, which degrades the accuracy of subsequent prototype-based inference. To tackle this problem, we propose a Siamese-prototype network (SPNet) with prototype self-calibration (SC) and inter-calibration (IC). First, to acquire more accurate prototypes, we utilize the supervision information from support labels to calibrate the prototypes generated from support features. This process is called SC. Second, we propose to treat the confidence scores of the query samples as another type of prototype, which is then used to predict the support samples in the same way. The resulting information interaction between support and query samples is implicitly a further calibration of the prototypes (so-called IC). Our model is optimized with three losses, of which two additional losses help the model learn more representative prototypes and make more accurate predictions. With no additional parameters to learn, our model is lightweight and convenient to employ. Experiments on three public remote sensing image datasets demonstrate competitive performance compared with other advanced few-shot image classification approaches. The source code is available at https://github.com/zoraup/SPNet.
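To make the prototype-based setup concrete, the following is a minimal sketch of the standard baseline the abstract builds on: class prototypes computed as the mean of each class's support embeddings, and query confidence scores derived from distances to those prototypes. This is an illustration of generic prototypical-network inference, not the authors' SPNet; the calibration steps (SC and IC) described above operate on top of these quantities, and all function names here are hypothetical.

```python
import numpy as np

def compute_prototypes(support_features, support_labels, n_classes):
    """Baseline prototypes: the mean embedding of each class's support
    samples. SPNet's self-calibration (SC) further refines these using
    the support labels; that step is not shown here."""
    return np.stack([
        support_features[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])

def query_scores(query_features, prototypes):
    """Confidence scores as negative Euclidean distances to each
    prototype (higher = more confident). SPNet reuses such query
    scores as a second kind of prototype to predict the support set
    in turn (inter-calibration, IC)."""
    dists = np.linalg.norm(
        query_features[:, None, :] - prototypes[None, :, :], axis=-1)
    return -dists

def classify_queries(query_features, prototypes):
    """Predict each query's class as the nearest prototype."""
    return query_scores(query_features, prototypes).argmax(axis=1)
```

Under this formulation, a noisy or unrepresentative support sample directly skews the class mean, which is the validity problem the proposed calibrations address.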
