Abstract

Recently, deep learning has been reported to be an effective method for improving hyperspectral image classification, and convolutional neural networks (CNNs) in particular are gaining more and more attention in this field. CNNs provide automatic approaches that can learn abstract features of hyperspectral images from the spectral, spatial, or spectral-spatial domains. However, CNN applications focus on learning features directly from image data, while the intrinsic relations between original features, which may provide more information for classification, are not fully considered. In order to make full use of the relations between hyperspectral features and to explore more objective features for improving classification accuracy, we propose feature relations map learning (FRML) in this paper. FRML can automatically enhance the separability of different objects in an image, using a segmented feature relations map (SFRM) that reflects the relations between spectral features through a normalized difference index (NDI), and it can then learn new features from the SFRM using a CNN-based feature extractor. Finally, based on these features, a classifier was designed for the classification. Our experimental results on four popular hyperspectral datasets indicate that FRML can achieve more representative and objective features to improve classification accuracy, outperforming the comparative methods.

Highlights

  • As the spectral resolution of remote sensing (RS) sensors has improved, hyperspectral technology has exhibited great potential for obtaining land use information with fine quality

  • For the Pavia University (PU) and SA datasets, the extended random walkers (ERW) method had a higher accuracy than feature relations map learning (FRML); the FRML method seemed to require a larger number of training samples to achieve an accuracy equal or similar to that achieved by ERW

  • By establishing relations between different spectral features using normalized difference index (NDI), it was found that each class has its own feature relations map (FRM) that could be used to distinguish it from other classes
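The NDI named in the highlights is the standard normalized difference form, NDI(i, j) = (b_i − b_j)/(b_i + b_j), applied to every pair of spectral bands of a pixel. A minimal sketch of building such a pairwise relations map follows; the function name and the zero-denominator handling are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def feature_relations_map(spectrum):
    """Build a pairwise feature relations map for one pixel using the
    normalized difference index: NDI(i, j) = (b_i - b_j) / (b_i + b_j).

    Returns an (n_bands, n_bands) antisymmetric matrix.
    """
    b = np.asarray(spectrum, dtype=float)
    num = b[:, None] - b[None, :]
    denom = b[:, None] + b[None, :]
    # Where both bands are zero the index is undefined; set it to 0 here
    # (an assumed convention, not specified by the paper).
    return np.where(denom != 0, num / np.where(denom != 0, denom, 1.0), 0.0)

# Example: a pixel with 4 spectral bands yields a 4x4 relations map.
frm = feature_relations_map([0.2, 0.4, 0.6, 0.8])
```

Each row of the resulting map characterizes how one band relates to all others, which is the kind of per-class structure the highlight describes.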


Summary

Introduction

As the spectral resolution of remote sensing (RS) sensors has improved, hyperspectral technology has exhibited great potential for obtaining land use information with fine quality. Hyperspectral RS images capture the spectrum of every pixel within observed scenes at hundreds of continuous and narrow bands. In comparison with multispectral images, which have wide wavelength bands, hyperspectral images can provide features hidden in narrow wavelengths in order to distinguish objects that are difficult to detect [1,2]. Hyperspectral RS image classification is an important process for transforming hyperspectral information from the ground’s surface into attribute information. It is an extension of conventional multispectral RS image classification, which aims at assigning each pixel to a unique class [6,7]. Hyperspectral images differ significantly from multispectral images because they have high-dimensional features, and the correlation between adjacent bands is often high. Along with other aspects, such as noise and mixed pixels, hyperspectral image classification suffers from data redundancy, dimensional disaster, and uncertainty, making this type of classification more complex and challenging [8,9].


