Abstract

Hyperspectral image classification (HSIC) has been a significant topic in remote sensing in recent years. Convolutional neural networks have shown promising performance in HSIC applications thanks to their strong local feature extraction ability; however, they struggle to extract global information from hyperspectral images (HSIs), which limits classification performance. Recently, vision transformers have been applied to HSIC, their chief advantage being the multi-head self-attention (MHSA) mechanism, which captures global dependencies. Nevertheless, the features extracted with MHSA are usually over-dispersed because of the abundant band information hidden in HSIs. In this work, we propose a novel method, the dual attention transformer network (DATN), for HSIC. It consists of two types of modules: the spatial–spectral hybrid transformer (SSHT) module and the spectral local-conv block (SLCB) module. The SSHT module uses MHSA to capture spatial and spectral feature information, so it can exploit global spatial–spectral features while simultaneously embedding local spatial information. In addition, we design the SLCB module to effectively extract the local spectral information of HSIs. The SSHT and SLCB modules are then integrated into an end-to-end framework, and the global and local spatial–spectral features extracted by this framework are fed into a fully connected layer to produce the HSI classification results. Experiments on three HSI datasets demonstrate that our DATN approach outperforms several state-of-the-art HSIC approaches.
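
The abstract names the SSHT and SLCB modules and the overall pipeline but not their internals. The following is a minimal PyTorch sketch of how such a pipeline could be wired together; the embedding layer, attention layout, pooling step, and every module internal below are assumptions for illustration only, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: the paper does not publish this code, and all
# module internals are assumed from the abstract's high-level description.

class SLCB(nn.Module):
    """Spectral local-conv block: local spectral mixing via 1x1 convs (assumed)."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),  # mixes bands per pixel
            nn.BatchNorm2d(channels),
            nn.GELU(),
        )

    def forward(self, x):          # x: (B, C, H, W) spectral feature cube
        return x + self.conv(x)    # residual keeps the original spectral content

class SSHT(nn.Module):
    """Spatial-spectral hybrid transformer: MHSA over pixel tokens (assumed layout)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                      # x: (B, C, H, W)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W, C) pixel tokens
        t = self.norm(tokens)
        attn_out, _ = self.attn(t, t, t)       # global spatial-spectral dependencies
        tokens = tokens + attn_out             # residual connection
        return tokens.transpose(1, 2).reshape(b, c, h, w)

class DATN(nn.Module):
    """End-to-end classifier: SSHT + SLCB features -> fully connected head."""
    def __init__(self, bands, num_classes, dim=64):
        super().__init__()
        self.embed = nn.Conv2d(bands, dim, kernel_size=3, padding=1)
        self.ssht = SSHT(dim)
        self.slcb = SLCB(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):               # x: (B, bands, H, W) HSI patch
        x = self.embed(x)
        x = self.slcb(self.ssht(x))     # global then local features (assumed order)
        x = x.mean(dim=(2, 3))          # pool the patch to one feature vector
        return self.head(x)

# Usage on a toy 9x9 patch with 200 spectral bands and 16 classes:
model = DATN(bands=200, num_classes=16)
logits = model(torch.randn(2, 200, 9, 9))   # -> shape (2, 16)
```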
