Abstract

Convolutional neural networks (CNNs) have attracted increasing attention in the field of multimodal cooperation. Recently, CNN-based methods have achieved remarkable performance in multisource remote sensing data classification. However, they still face challenges in extracting complementary information. In this paper, an adversarial complementary learning strategy is embedded into a CNN model, called ACL-CNN, to extract the complementary information of multisource data. The proposed ACL-CNN filters out the common patterns and specific patterns of multisource data by conducting an adversarial max-min game. Specifically, the modality-independent common patterns constitute the basic representation of the land covers, while the specific patterns, which are linearly independent of the common patterns, provide a supplementary representation. The complementary information is thereby mapped to a compact and discriminative representation. To eliminate singularity noise, a learnable pattern sampling module (PSM) is designed to capture the mutual-exclusion relationship between specific patterns. Extensive experiments on three datasets demonstrate the superiority of the proposed ACL-CNN over several existing classification techniques.
