Abstract

Graph-based hyperspectral image classification (HSIC) methods have attracted wide attention because they can extract information with a non-Euclidean structure. Many graph-based HSIC works have achieved good results, but unresolved technical issues remain. For example, a large number of graph nodes leads to high computational costs, and the mining of non-Euclidean structures is insufficient. To solve these problems, we propose a graph attention network with an adaptive graph structure mining (GAT-AGSM) approach. Specifically, we first propose an HSIC framework with a superpixel feature subdivision (SFS) mechanism. In this framework, the number of nodes in the graph structure is reduced by using superpixel segmentation algorithms, and the SFS mechanism is designed to generate finer classification results. Second, we design a spatial–spectral attention layer with an adaptive graph structure mining (AGSM) mechanism for the graph attention network. The spatial–spectral attention layer filters information in both the spatial and spectral dimensions, and the AGSM mechanism requires little manual intervention to dynamically generate non-Euclidean graph structures that better aggregate information. We conduct extensive experiments comparing the proposed GAT-AGSM with seven nongraph methods and three graph-based methods on widely used datasets. On the Indian Pines, Pavia University, and Salinas datasets, the overall accuracy of GAT-AGSM improves on the best comparison method by at least 4.26%, 2.59%, and 1.41%, respectively. Experimental results show that GAT-AGSM outperforms the baselines across various metrics.
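The attention-based neighbor aggregation at the core of the approach can be sketched as follows. This is an illustrative NumPy implementation of a generic single-head graph attention step (in the style of GAT), not the authors' spatial–spectral layer; all function and parameter names are assumptions for illustration.

```python
import numpy as np

def graph_attention(H, A, W, a, slope=0.2):
    """One illustrative graph attention aggregation step.

    H: (N, F) node features (e.g., one feature vector per superpixel node),
    A: (N, N) adjacency matrix (nonzero = edge),
    W: (F, Fp) learnable projection, a: (2*Fp,) attention vector.
    Returns (N, Fp) aggregated node features.
    """
    Wh = H @ W                                  # project features: (N, Fp)
    N = Wh.shape[0]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            # attention logit e_ij = LeakyReLU(a^T [Wh_i || Wh_j])
            z = np.concatenate([Wh[i], Wh[j]]) @ a
            e[i, j] = z if z > 0 else slope * z
    # restrict attention to graph edges, then softmax over each node's neighbors
    e = np.where(A > 0, e, -1e9)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ Wh                           # attention-weighted aggregation
```

In a superpixel-based HSIC pipeline, each node would correspond to a superpixel rather than a pixel, which is what keeps N, and hence the O(N^2) attention cost, tractable.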
