Abstract

In deep learning (DL)-based hyperspectral imagery classification, "spatial patching" is primarily used as a preprocessing step to incorporate local spatial information. This operation can improve classification accuracy, but it faces new challenges in unmanned aerial vehicle (UAV)-borne hyperspectral imagery with high spatial and spectral resolutions (H² imagery). The various spatial scales of ground objects make it difficult to determine the optimal spatial patch size. In addition, due to the severe spectral variability and spatial heterogeneity of H² imagery, "spatial patching" exploits only the local spatial information, resulting in serious salt-and-pepper (SP) noise and isolated areas in the classification maps. In this article, to address these issues, a novel spectral patching network (SPNet) with an end-to-end DL architecture is proposed for UAV-borne H² imagery classification. The "spectral patching" approach is proposed to preserve the global spatial information and almost all of the spectral information of the original hyperspectral imagery. An end-to-end deep encoder–decoder network is then constructed based on the spectral patching mechanism, introducing deep residual network (ResNet) and atrous spatial pyramid pooling (ASPP) modules to extract multiscale high-level semantic information for H² imagery classification. The experimental results obtained with the Wuhan UAV-borne hyperspectral imagery (WHU-Hi) data set demonstrate that SPNet can achieve state-of-the-art accuracy and visualization performance in H² imagery classification.
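The ASPP module mentioned above applies the same convolution kernel at several dilation ("atrous") rates in parallel, so each branch sees a different spatial context without growing the kernel. The following is a minimal 1-D sketch of that idea in pure Python; the function names, shapes, and rates are illustrative assumptions, not the SPNet implementation.

```python
# Illustrative sketch of atrous (dilated) convolution, the mechanism
# behind ASPP's parallel multi-scale branches. Pure Python, 1-D case.
# All names and parameters here are hypothetical, not from the paper.

def dilated_conv1d(signal, kernel, rate):
    """Valid 1-D convolution whose kernel taps are spaced `rate` apart."""
    span = (len(kernel) - 1) * rate  # receptive field minus one
    out = []
    for start in range(len(signal) - span):
        acc = 0.0
        for k, w in enumerate(kernel):
            acc += w * signal[start + k * rate]
        out.append(acc)
    return out

def aspp_1d(signal, kernel, rates=(1, 2, 4)):
    """Run the same kernel at several dilation rates, as ASPP does
    with its parallel branches, and return one output per rate."""
    return {r: dilated_conv1d(signal, kernel, r) for r in rates}

sig = [0, 1, 2, 3, 4, 5, 6, 7]
k = [1, -1]  # simple difference kernel
branches = aspp_1d(sig, k)
# rate 1 differences adjacent samples; rate 4 differences samples
# four apart, so larger rates capture wider context at no extra cost.
```

In a 2-D network such as SPNet's encoder–decoder, the same effect is typically obtained with dilated 2-D convolutions whose branch outputs are concatenated before the decoder.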
