Abstract
High spatial and spectral resolution (H2) imagery collected by unmanned aerial vehicle (UAV) systems is an important data source for precise crop classification. Although this data source provides abundant information about the crops of interest, it also introduces new challenges for image processing. Specifically, the spectral similarity of green crops leads to small inter-class distances, while the severe intra-class spectral variability and high spatial heterogeneity of H2 imagery increase the difficulty of precise classification. In addition, the scales of the different crop plots can differ greatly, which makes it difficult to determine the optimal patch size for deep learning based classification models. In this paper, a spectral-spatial-scale attention network (S3ANet) is proposed for H2 imagery based precise crop classification. In the proposed method, each channel, each pixel, and each scale perception of the feature map is adaptively weighted to relieve the intra-class spectral variability, the spatial heterogeneity, and the scale differences of the crop plots, respectively. Furthermore, S3ANet introduces the additive angular margin loss function to further increase the inter-class distances between the different crops and reduce misclassification. S3ANet was verified on the public WHU-Hi UAV-borne hyperspectral dataset and on the new WHU-Hi-JiaYu dataset, a precise rice classification dataset built by the authors. In these experiments, the overall accuracy of S3ANet exceeded 96% on all datasets with 50 training pixels per class, a significant improvement over state-of-the-art hyperspectral image classifiers such as SSRN, CNNCRF, and FPGA. The code of S3ANet is available at http://rsidea.whu.edu.cn/resource_sharing.htm.
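The additive angular margin loss mentioned in the abstract modifies the classification logits before the softmax: features and class weight vectors are L2-normalised so their dot product equals cos(θ), and a fixed margin m is added to the angle of the ground-truth class, forcing larger angular separation between crop classes. A minimal NumPy sketch of this logit construction (the function name, margin m = 0.5, and scale s = 30 are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def additive_angular_margin_logits(features, weights, labels, s=30.0, m=0.5):
    """Additive angular margin logits (illustrative sketch).

    features: (N, D) feature vectors from the network backbone
    weights:  (C, D) class weight vectors (one per crop class)
    labels:   (N,) integer ground-truth class indices
    s, m:     scale and angular margin hyperparameters (assumed values)
    """
    # L2-normalise so the dot product is exactly cos(theta)
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    cos_theta = np.clip(f @ w.T, -1.0, 1.0)          # (N, C)
    theta = np.arccos(cos_theta)

    # Add the margin m only to the angle of the ground-truth class
    target = np.zeros_like(cos_theta, dtype=bool)
    target[np.arange(len(labels)), labels] = True
    logits = np.where(target, np.cos(theta + m), cos_theta)

    # Scaled logits are then fed to a standard softmax cross-entropy
    return s * logits
```

Because cos is decreasing on [0, π], adding m lowers the target-class logit, so the network must push same-class features closer to their class weight vector to compensate, which enlarges the inter-class margins the abstract refers to.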
Published in: ISPRS Journal of Photogrammetry and Remote Sensing