Abstract

Timely and accurate land-cover mapping with remote sensing images plays an important role in ecosystem monitoring. However, due to the spectral variability and spatial complexity of high-resolution remote sensing images, it is often difficult to find an efficient method for accurate land-cover classification. To exploit contextual and hierarchical features in remote sensing images, this paper proposes a U-shaped object graph neural network (U-OGNN), which is mainly composed of a self-adaptive graph construction (SAGC) module and a hierarchical graph encoder and decoder. In the self-adaptive graph construction, a similarity measurement is applied to deep features extracted by convolution and multi-layer attention operations to generate a context-aware graph structure. The graph encoder and decoder fuse multi-level information across different scales by capturing hierarchical features of adjacent objects. In this way, the proposed method predicts land-cover types accurately by considering multi-level contextual information. In experiments on the GID land-cover classification dataset, the overall accuracy of the U-OGNN reaches 87.81%.

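To make the graph-construction idea concrete, the following is a minimal sketch of similarity-based adjacency building over per-object deep features. The abstract does not specify the exact similarity measurement, so cosine similarity with a hypothetical threshold is assumed here purely for illustration; the function name and parameters are not from the paper.

```python
import numpy as np

def build_similarity_graph(features: np.ndarray, threshold: float = 0.8) -> np.ndarray:
    """Build an adjacency matrix by thresholding pairwise cosine similarity.

    features: (N, D) array of per-object deep feature vectors
              (assumed to come from convolution + attention layers)
    returns:  (N, N) binary adjacency matrix without self-loops
    """
    # L2-normalize each feature vector so dot products equal cosine similarity
    norms = np.linalg.norm(features, axis=1, keepdims=True) + 1e-8
    normalized = features / norms

    # Pairwise cosine similarity between all object feature vectors
    similarity = normalized @ normalized.T

    # Connect objects whose similarity exceeds the threshold; drop self-loops
    adjacency = (similarity > threshold).astype(np.float32)
    np.fill_diagonal(adjacency, 0.0)
    return adjacency

# Example: 5 objects with 16-dimensional deep features
rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 16))
A = build_similarity_graph(feats, threshold=0.2)
print(A)
```

The resulting adjacency matrix would then serve as the graph structure over which a graph encoder and decoder aggregate features of adjacent objects; the paper's actual construction may be learned or multi-scale rather than a fixed threshold.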