Abstract

Timely and accurate land cover mapping with remote sensing images plays a vital role in ecosystem monitoring. However, the spectral variability and spatial complexity of high-resolution remote sensing images make accurate land cover classification difficult. To exploit contextual and hierarchical features in remote sensing images, this paper proposes a U-shaped object graph neural network (U-OGNN), mainly composed of a self-adaptive graph construction (SAGC) module, a hierarchical graph encoder, and a decoder. SAGC generates a context-aware graph structure by applying a similarity measurement to deep features extracted through convolution and multi-layer attention operations. The graph encoder and decoder fuse multi-level information across scales by capturing hierarchical features of adjacent objects. In this way, the proposed method accurately predicts land-cover types by considering multi-level contextual information. In experiments on the GID land-cover classification dataset, the overall accuracy of the U-OGNN reaches 87.81%.
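To illustrate the general idea behind similarity-based graph construction, the following is a minimal sketch that connects objects whose feature vectors are sufficiently similar. The function name, the cosine-similarity measure, and the threshold are assumptions for illustration; the paper's SAGC module learns the graph structure adaptively from deep features, which is not reproduced here.

```python
import numpy as np

def build_similarity_graph(features, threshold=0.8):
    """Hypothetical sketch: adjacency matrix from object features
    via cosine similarity, with edges above a fixed threshold."""
    # Normalize feature vectors so dot products equal cosine similarity.
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    normed = features / np.clip(norms, 1e-12, None)
    sim = normed @ normed.T
    # Connect pairs of objects whose similarity exceeds the threshold.
    adj = (sim >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)  # no self-loops
    return adj

# Three object feature vectors: the first two are similar, the third is not.
feats = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
adj = build_similarity_graph(feats)
```

A learned variant would replace the fixed threshold with a parameterized similarity function trained end-to-end with the encoder and decoder.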
