Abstract
Thanks to their rich spectral-spatial information, hyperspectral images (HSIs) have been widely exploited in Earth observation. Recently, graph convolutional networks (GCNs) have attracted increasing attention in HSI classification due to their advantages in processing non-Euclidean data. Unlike convolutional neural networks (CNNs), which perform convolution on regular square regions, GCNs can operate directly on graph-structured data to extract the relationships among adjacent land covers. However, extracting meaningful and deeply discriminative spectral-spatial features from HSIs remains a challenging task. In this article, a novel multi-scale feature learning method via a residual dynamic graph convolutional network is designed for HSI classification, which extracts large-scale contextual spatial structures at the superpixel level and local spectral-spatial information at the pixel level, significantly improving classification performance. Unlike existing GCN-based methods that operate on a graph with a fixed neighbourhood size, multiple graphs with diverse neighbourhood scales are built to comprehensively leverage spectral-spatial information and relationships at multiple scales, and these graphs are dynamically updated during the convolution process (via dynamic GCN) to generate more discriminative features. Moreover, to make full use of the multi-scale features extracted from HSIs, a multi-scale feature fusion module is developed to emphasize important features and suppress irrelevant ones. Extensive experiments carried out on three benchmark data sets demonstrate the superiority of the proposed approach over other state-of-the-art methods.
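The core ideas summarized above — per-scale kNN graphs, graph convolution, and a dynamic graph rebuilt from updated features — can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the authors' implementation: the superpixel construction, residual connections, attention-based fusion, and training procedure are all omitted, and plain concatenation stands in for the fusion module.

```python
import numpy as np

def knn_adjacency(X, k):
    """Build a symmetric, degree-normalized kNN graph from features X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    A = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]             # k nearest, skip self
    for i, nbrs in enumerate(idx):
        A[i, nbrs] = 1.0
    A = np.maximum(A, A.T)                               # symmetrize
    A = A + np.eye(len(X))                               # add self-loops
    deg = A.sum(1)
    return A / np.sqrt(np.outer(deg, deg))               # D^-1/2 A D^-1/2

def gcn_layer(A_hat, H, W):
    """One graph-convolution layer: ReLU(A_hat @ H @ W)."""
    return np.maximum(A_hat @ H @ W, 0.0)

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 8))   # 20 toy "pixels", 8 spectral bands

# Multiple graphs with diverse neighbourhood scales (hypothetical sizes 3 and 6).
feats = []
for k in (3, 6):
    W1 = rng.standard_normal((8, 4)) * 0.1
    H = gcn_layer(knn_adjacency(X, k), X, W1)
    # Dynamic update: rebuild the graph from the *new* features before
    # the next convolution, so the neighbourhood adapts during learning.
    W2 = rng.standard_normal((4, 4)) * 0.1
    H = gcn_layer(knn_adjacency(H, k), H, W2)
    feats.append(H)

# Naive multi-scale fusion by concatenation (the paper uses a learned module).
fused = np.concatenate(feats, axis=1)
print(fused.shape)  # (20, 8)
```

The key step is that `knn_adjacency` is called again on the intermediate features `H`, so edges follow the evolving feature space rather than staying fixed, which is the "dynamic" behaviour the abstract refers to.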