Abstract
The advancement of ultra-hyperspectral imaging technology, exemplified by the AisaIBIS sensor, has enabled a leap from hyperspectral data (hundreds of bands) to ultra-hyperspectral data (thousands of bands), offering immense potential for precise ground-object recognition in intricate scenes. However, the complexity of ground-object features, coupled with the copious redundant information in ultra-hyperspectral data, poses substantial challenges for accurate recognition. This paper therefore proposes a comprehensive framework to explore an optimal strategy for the precise classification of ultra-hyperspectral data in a complex scene (12 vegetation and non-vegetation classes). (a) We investigate the influence of diverse feature subsets and a range of machine learning classifiers on the accuracy of ground-object recognition. The proposed strategy achieves an overall accuracy of up to 88.44%, effectively avoiding the curse of dimensionality and significantly enhancing the ability to recognize complex ground objects. (b) Furthermore, by simulating hyperspectral images at different spectral resolutions, we compared the classification results of ultra-hyperspectral data (0.11 nm) with those of hyperspectral datasets (10 nm, 5 nm, and 1 nm) using machine learning methods. Relative to the hyperspectral datasets, the ultra-hyperspectral data improved classification accuracy by 5.30–6.38%, substantiating its pronounced advantage in precision land-cover classification. This study provides a valuable reference for applying ultra-hyperspectral data to the recognition of complex ground-object scenes and to accurate urban monitoring.
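The simulation of coarser-resolution hyperspectral datasets from the 0.11 nm ultra-hyperspectral cube can be illustrated with a minimal sketch. The abstract does not specify the resampling method, so the box-filter averaging of adjacent narrow bands below, along with the function name `resample_spectra` and the synthetic cube dimensions, are illustrative assumptions rather than the paper's actual procedure:

```python
import numpy as np

def resample_spectra(cube, native_res=0.11, target_res=10.0):
    """Simulate a coarser spectral resolution by averaging groups of
    adjacent narrow bands (a simple box-filter assumption; real sensors
    have Gaussian-like spectral response functions).

    cube: array of shape (rows, cols, bands) sampled at `native_res` nm.
    Returns an array with bands spaced roughly `target_res` nm apart.
    """
    factor = int(round(target_res / native_res))  # narrow bands per output band
    n_out = cube.shape[-1] // factor
    trimmed = cube[..., : n_out * factor]         # drop leftover trailing bands
    # Group contiguous narrow bands, then average within each group.
    return trimmed.reshape(*cube.shape[:-1], n_out, factor).mean(axis=-1)

# Example: a synthetic 4x4 scene with 2000 bands at 0.11 nm spacing,
# degraded to an approximately 10 nm dataset.
cube = np.random.rand(4, 4, 2000).astype(np.float32)
coarse = resample_spectra(cube, target_res=10.0)
```

Averaging groups of 91 narrow bands (10 / 0.11 ≈ 91) collapses the 2000-band cube to 21 broad bands, mimicking the information loss that the paper's comparison attributes the 5.30–6.38% accuracy gap to.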