Abstract

Existing land cover classification methods mostly rely on either optical or synthetic aperture radar (SAR) features alone, ignoring the complementarity between the two sources. In this article, we compare the distribution histograms of deep semantic features extracted from the optical and SAR modalities within land cover categories, which intuitively demonstrates the large complementary potential between optical and SAR features. Motivated by this observation, we propose a novel collaborative attention-based heterogeneous gated fusion network (CHGFNet), which hierarchically fuses optical and SAR features for land cover classification. More specifically, CHGFNet consists of three main components: a two-stream feature extractor, a multimodal collaborative attention module (MCAM), and a gated heterogeneous fusion module (GHFM). Given optical and SAR patch pairs, the two-stream feature extractor applies a multistage feature learning strategy to obtain discriminative optical and SAR features. Then, to exploit the inherent complementarity between the two modalities, MCAM is embedded into CHGFNet to capture the correlation between optical and SAR features by computing collaborative attention in a joint feature space. Finally, GHFM fuses the optical and SAR features, automatically learning their varying contributions to the classification of different land cover categories. Extensive comparative evaluations on three co-registered optical and SAR data sets demonstrate the advantages of CHGFNet over state-of-the-art land cover classification methods.
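To make the described architecture more concrete, the following is a minimal PyTorch sketch of the three-component design named in the abstract (two-stream extractor, collaborative cross-modal attention, gated fusion). It is not the authors' implementation: the backbone depth, the multihead cross-attention formulation, the sigmoid gating, and all layer sizes and class names (e.g., `CHGFNetSketch`) are assumptions for illustration only.

```python
# Hypothetical sketch of the CHGFNet idea; all design details are assumed, not taken from the paper.
import torch
import torch.nn as nn


class TwoStreamExtractor(nn.Module):
    """Separate convolutional encoders for the optical and SAR patches."""
    def __init__(self, opt_channels=3, sar_channels=1, dim=64):
        super().__init__()
        def stream(in_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, dim, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            )
        self.opt_stream = stream(opt_channels)
        self.sar_stream = stream(sar_channels)

    def forward(self, opt, sar):
        return self.opt_stream(opt), self.sar_stream(sar)


class CollaborativeAttention(nn.Module):
    """Cross-modal attention: each modality attends to the other in a joint feature space."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.opt_to_sar = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.sar_to_opt = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, f_opt, f_sar):
        b, c, h, w = f_opt.shape
        opt_seq = f_opt.flatten(2).transpose(1, 2)   # (B, HW, C)
        sar_seq = f_sar.flatten(2).transpose(1, 2)
        opt_att, _ = self.opt_to_sar(opt_seq, sar_seq, sar_seq)  # optical queries SAR
        sar_att, _ = self.sar_to_opt(sar_seq, opt_seq, opt_seq)  # SAR queries optical
        restore = lambda x: x.transpose(1, 2).reshape(b, c, h, w)
        return restore(opt_att), restore(sar_att)


class GatedFusion(nn.Module):
    """Per-location gates that weight the contribution of each modality."""
    def __init__(self, dim=64):
        super().__init__()
        self.gate = nn.Sequential(nn.Conv2d(2 * dim, dim, 1), nn.Sigmoid())

    def forward(self, f_opt, f_sar):
        g = self.gate(torch.cat([f_opt, f_sar], dim=1))
        return g * f_opt + (1.0 - g) * f_sar


class CHGFNetSketch(nn.Module):
    """Toy end-to-end model: extract, cross-attend, gate-fuse, classify."""
    def __init__(self, num_classes=10, dim=64):
        super().__init__()
        self.extractor = TwoStreamExtractor(dim=dim)
        self.attention = CollaborativeAttention(dim=dim)
        self.fusion = GatedFusion(dim=dim)
        self.classifier = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                        nn.Linear(dim, num_classes))

    def forward(self, opt_patch, sar_patch):
        f_opt, f_sar = self.extractor(opt_patch, sar_patch)
        f_opt, f_sar = self.attention(f_opt, f_sar)
        fused = self.fusion(f_opt, f_sar)
        return self.classifier(fused)


if __name__ == "__main__":
    model = CHGFNetSketch()
    logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 1, 64, 64))
    print(logits.shape)  # torch.Size([2, 10])
```

The gating step makes the modality weighting explicit: where the sigmoid gate is near 1 the optical feature dominates, and where it is near 0 the SAR feature dominates, which mirrors the abstract's claim that the contributions of the two modalities vary across land cover categories.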
