Abstract

Recently, prototypical-network-based few-shot learning (FSL) has been introduced for small-sample hyperspectral image (HSI) classification and has shown good performance. However, existing prototype-based FSL methods suffer from two problems: prototype instability and domain shift between the training and testing datasets. To address these problems, we propose a refined prototypical contrastive learning network for few-shot learning (RPCL-FSL), which incorporates supervised contrastive learning and FSL into an end-to-end network to perform small-sample HSI classification. To stabilize and refine the prototypes, RPCL-FSL imposes three constraints on the support-set prototypes, namely constraints based on contrastive learning (CL), self-calibration (SC), and cross-calibration (CC). The CL module imposes an internal constraint that directly refines the prototypes using support-set samples within the CL framework, while the SC and CC modules impose external constraints on the prototypes using the prediction loss of the support-set samples and the query-set prototypes, respectively. To alleviate domain shift in FSL, a fusion training strategy is designed to reduce the feature differences between the training and testing datasets. Experimental results on three HSI datasets demonstrate that the proposed RPCL-FSL outperforms existing state-of-the-art deep learning and FSL methods.
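
The abstract does not give the exact loss formulations, so the following is a minimal PyTorch sketch of how the three prototype constraints could be combined in one training episode. The encoder, the loss weights (w_cl, w_sc, w_cc), the supervised-contrastive temperature, and the mean-squared-error form of the cross-calibration term are illustrative assumptions rather than the authors' implementation, and the fusion training strategy for domain shift is omitted.

```python
import torch
import torch.nn.functional as F

def class_prototypes(embeddings, labels, n_classes):
    """Mean embedding per class (standard prototypical-network prototypes)."""
    return torch.stack([embeddings[labels == c].mean(dim=0) for c in range(n_classes)])

def proto_logits(samples, prototypes):
    """Negative squared Euclidean distance to each prototype, used as class logits."""
    return -torch.cdist(samples, prototypes).pow(2)

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized support embeddings:
    same-class samples are pulled together, different classes pushed apart."""
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    # Log-softmax over all other samples, averaged over the positives of each anchor.
    log_prob = sim - torch.logsumexp(sim.masked_fill(eye, float('-inf')), dim=1, keepdim=True)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)
    return loss.mean()

def episode_loss(encoder, support_x, support_y, query_x, query_y, n_classes,
                 w_cl=1.0, w_sc=1.0, w_cc=1.0):
    """One meta-training episode combining the FSL loss with the CL, SC, and CC
    constraints described in the abstract; the weights are illustrative."""
    zs, zq = encoder(support_x), encoder(query_x)
    proto_s = class_prototypes(zs, support_y, n_classes)

    # Main FSL term: classify query samples against support prototypes.
    loss_fsl = F.cross_entropy(proto_logits(zq, proto_s), query_y)

    # CL (internal): supervised contrastive loss on the support embeddings.
    loss_cl = supervised_contrastive_loss(zs, support_y)

    # SC (external): prediction loss of the support samples on their own prototypes.
    loss_sc = F.cross_entropy(proto_logits(zs, proto_s), support_y)

    # CC (external): align support prototypes with query-set prototypes
    # (MSE chosen here as one plausible alignment criterion).
    proto_q = class_prototypes(zq, query_y, n_classes)
    loss_cc = F.mse_loss(proto_s, proto_q)

    return loss_fsl + w_cl * loss_cl + w_sc * loss_sc + w_cc * loss_cc
```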
