Abstract
Cross-domain classification with small samples is a challenging and realistic experimental setup. To date, few studies have addressed small-sample cross-domain classification between completely different hyperspectral images (HSIs), which possess different land cover types and statistical characteristics. To this end, this paper proposes a general-purpose representation learning method for cross-domain HSI classification, aiming to learn general-purpose deep representations that can quickly adapt to different target domains with small samples. The core of the method is a novel three-level distillation strategy that transfers knowledge from multiple models well trained on source HSIs into a single distilled model at the channel, feature, and logit levels simultaneously. The learned representations can then be fine-tuned with small samples to adapt quickly to new target HSIs and previously unseen classes. Specifically, to transfer and fuse knowledge from multiple source domains into a single model while resolving the inconsistent number of bands across different HSIs, an extensible multi-task model is designed, consisting of a channel transformation module, a feature extraction module, and a linear classification module. Only the feature extraction module is shared across different HSIs; the other two modules are domain-specific. Furthermore, the episode-based learning strategy typical of metric-based meta-learning is adopted throughout the learning process to further improve generalization ability and data efficiency. Extensive experiments on six source HSIs and four target HSIs demonstrate that the proposed method outperforms existing advanced methods in cross-domain HSI classification with small samples.
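To make the extensible multi-task architecture concrete, the following is a minimal sketch in PyTorch of one possible realization: a domain-specific channel transformation (a 1×1 convolution mapping each HSI's band count to a common channel size), a single shared feature extraction module, and domain-specific linear classification heads. All module names, layer sizes, band counts, and class counts are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (assumption: PyTorch) of an extensible multi-task model with
# domain-specific channel transformation and classification modules and a
# shared feature extraction module, as described in the abstract.
import torch
import torch.nn as nn


class MultiDomainHSINet(nn.Module):
    """One channel-transformation and classification head per source HSI;
    a single feature extractor shared across all domains."""

    def __init__(self, band_counts, class_counts, common_channels=64, feat_dim=128):
        super().__init__()
        # Domain-specific channel transformation: maps each HSI's band count
        # to a common number of channels so a shared extractor can follow.
        self.channel_transforms = nn.ModuleList(
            [nn.Conv2d(bands, common_channels, kernel_size=1) for bands in band_counts]
        )
        # Shared feature extraction module (a small CNN stand-in here).
        self.feature_extractor = nn.Sequential(
            nn.Conv2d(common_channels, feat_dim, kernel_size=3, padding=1),
            nn.BatchNorm2d(feat_dim),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Domain-specific linear classification modules.
        self.classifiers = nn.ModuleList(
            [nn.Linear(feat_dim, n_cls) for n_cls in class_counts]
        )

    def forward(self, x, domain_idx):
        x = self.channel_transforms[domain_idx](x)   # align the band dimension
        feat = self.feature_extractor(x)             # shared deep representation
        return self.classifiers[domain_idx](feat)    # domain-specific logits


# Example: two hypothetical source HSIs with different band and class counts.
model = MultiDomainHSINet(band_counts=[103, 200], class_counts=[9, 16])
patch = torch.randn(4, 103, 9, 9)        # a batch of 9x9 patches from domain 0
logits = model(patch, domain_idx=0)      # -> shape (4, 9)
```

Under this design, adapting to a new target HSI would only require attaching a fresh channel transformation and classifier for that domain and fine-tuning with small samples, while the shared feature extractor carries over the distilled general-purpose representations.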