Abstract
In the past few years, many convolutional neural networks (CNNs) have been applied to hyperspectral image (HSI) classification. However, many of them have the following drawbacks: they do not fully exploit the abundant information in the spectral bands and insufficiently extract the spatial information of HSIs; all bands and neighboring pixels are treated equally, so CNNs may learn features from redundant or useless bands/pixels; and a significant amount of hidden semantic information is lost when a single-scale convolution kernel is used. To alleviate these problems, we propose a spatial–spectral split attention residual network (S³ARN) for HSI classification. In S³ARN, a split attention strategy is used to fuse the features extracted from multireceptive fields, in which both the spectral and spatial split attention modules are composed of bottleneck residual blocks. Thanks to the bottleneck structure, the proposed method can effectively prevent overfitting, speed up model training, and reduce the number of network parameters. Moreover, the spectral and spatial attention residual branches generate attention masks that simultaneously emphasize useful bands and neighboring pixels while suppressing useless ones. Experimental results on three benchmark datasets demonstrate the effectiveness of the proposed model for HSI classification.
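As a rough illustration of the split attention idea described above, the following is a minimal, hypothetical PyTorch-style sketch of a split-attention residual block: parallel branches with different kernel sizes model multiple receptive fields, a channel-reducing bottleneck predicts softmax attention over the branches, and the fused output is added back to the input. The class name, kernel sizes, and reduction ratio are assumptions made for illustration only; they do not reproduce the exact spectral and spatial module design of S³ARN.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitAttentionBlock(nn.Module):
    """Hypothetical sketch of a split-attention residual block (not the paper's exact design)."""

    def __init__(self, channels, kernel_sizes=(1, 3, 5), reduction=4):
        super().__init__()
        # One branch per receptive field (multireceptive-field feature extraction).
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])
        hidden = max(channels // reduction, 8)
        # Bottleneck: squeeze pooled features, then predict one attention logit
        # per branch and channel.
        self.fc1 = nn.Conv2d(channels, hidden, 1)
        self.fc2 = nn.Conv2d(hidden, channels * len(kernel_sizes), 1)
        self.num_splits = len(kernel_sizes)
        self.channels = channels

    def forward(self, x):
        # feats: (B, S, C, H, W) -- one feature map per receptive field.
        feats = torch.stack([b(x) for b in self.branches], dim=1)
        gap = feats.sum(dim=1).mean(dim=(2, 3), keepdim=True)    # (B, C, 1, 1)
        attn = self.fc2(F.relu(self.fc1(gap)))                   # (B, S*C, 1, 1)
        attn = attn.view(-1, self.num_splits, self.channels, 1, 1)
        attn = F.softmax(attn, dim=1)                             # attention mask over splits
        fused = (feats * attn).sum(dim=1)                         # weighted fusion of branches
        return F.relu(fused + x)                                  # residual connection

if __name__ == "__main__":
    # Toy HSI patch batch: 8 samples, 32 feature channels, 9x9 spatial window (assumed sizes).
    block = SplitAttentionBlock(channels=32)
    print(block(torch.randn(8, 32, 9, 9)).shape)  # torch.Size([8, 32, 9, 9])

The softmax across splits acts as the attention mask: branches (and, per channel, bands or neighborhoods) that contribute useful evidence receive larger weights, while redundant ones are suppressed, which matches the emphasize/suppress behavior the abstract attributes to the attention residual branches.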