Abstract

Hyperspectral image (HSI) classification is a challenging task due to subtle interclass differences and large intraclass variability, especially when the available training samples are scarce. To overcome this barrier, this article proposes a novel deep similarity network (DSN) for HSI classification, which not only ensures enough samples for training but also extracts more discriminative features. Unlike other classification methods, our essential idea is to approach the classification task by learning a new similarity measure of pixel pairs under a two-branch neural network. Specifically, a binary classification dataset with same-class and different-class pixel pairs is first constructed, which can significantly increase the number of training samples. Then, the DSN utilizes two subnetworks to extract deep features from the pixel pairs, and computes the similarity between the extracted deep features by a fusion subnetwork. Finally, the output of the DSN is used to measure the similarity to each class, and this similarity determines the class label. To make full use of the spatial information, the extended multiattribute profile is incorporated into the DSN. Moreover, a joint loss function is proposed to enhance the discrimination and alleviate the challenge caused by the spatial variability of spectral signatures. Experiments on real HSI datasets verify the superiority of the DSN over several state-of-the-art methods in HSI classification. For instance, the overall accuracy of the DSN on the Houston2013 dataset is 89.07%, a marked improvement of at least 4.2% over all compared methods, such as the convolutional neural network and deep learning with attribute profiles.
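The pairing step described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, toy pixel values, and class names are invented, and same-class pairs are labeled 1 while different-class pairs are labeled 0.

```python
from itertools import combinations

def build_pair_dataset(pixels, labels):
    """Turn N labeled pixels into N*(N-1)/2 labeled pixel pairs,
    greatly enlarging the effective training set."""
    pairs = []
    for i, j in combinations(range(len(pixels)), 2):
        same = 1 if labels[i] == labels[j] else 0  # binary pair label
        pairs.append(((pixels[i], pixels[j]), same))
    return pairs

# Toy example: 4 pixels from 2 classes yield 6 training pairs.
pixels = [[0.1, 0.2], [0.1, 0.3], [0.9, 0.8], [0.8, 0.9]]
labels = ["corn", "corn", "oats", "oats"]
pairs = build_pair_dataset(pixels, labels)
```

Note how the quadratic growth in pair count is exactly what lets the DSN train a deep network from scarce labeled pixels.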

Highlights

  • Hyperspectral image (HSI) classification has achieved great success in the field of remote sensing and played a significant role in a wide variety of applications, such as urban planning, environmental monitoring, and precision agriculture [1]–[5]

  • The class “Oats” only has dozens of labeled samples in the Indian Pines dataset, while the largest class “Soybean-mintill” contains thousands. This imbalance affects the performance of conventional deep learning-based HSI classification methods, in which neural network classifiers are directly trained for multiclass classification

  • We propose the deep similarity network (DSN) to learn a similarity measure, which can be used to differentiate whether two pixels of a pixel pair are similar or not


Summary

INTRODUCTION

Hyperspectral image (HSI) classification has achieved great success in the field of remote sensing and played a significant role in a wide variety of applications, such as urban planning, environmental monitoring, and precision agriculture [1]–[5]. In [41], Li et al. proposed a pixel-pair model to ensure a sufficient amount of input data for learning the parameters of a CNN. These methods augment the number of available training samples to ease the training of deep networks without changing the conventional mode of operation, i.e., inputting a sample and outputting its predicted label. Some HSI classification methods adopt ideas from metric learning [45]–[47]: they use deep neural networks to extract spectral or spectral-spatial features, followed by a distance metric. The class “Oats” only has dozens of labeled samples in the Indian Pines dataset, while the largest class “Soybean-mintill” contains thousands. This imbalance affects the performance of conventional deep learning-based HSI classification methods, in which neural network classifiers are directly trained for multiclass classification.
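The classification-by-similarity step can be sketched as follows. This is a hedged illustration, not the paper's implementation: a simple RBF kernel on Euclidean distance stands in for the trained DSN's learned similarity, and the function names, reference pixels, and class names are invented.

```python
import math

def similarity(x, y):
    # Placeholder for the trained DSN's output; an RBF kernel on
    # Euclidean distance stands in for the learned similarity score.
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2)

def classify(query, references):
    """Assign the class whose labeled reference pixels score the
    highest average similarity to the query pixel."""
    scores = {
        cls: sum(similarity(query, r) for r in refs) / len(refs)
        for cls, refs in references.items()
    }
    return max(scores, key=scores.get)

# Toy reference set: a few labeled pixels per class.
references = {
    "corn": [[0.1, 0.2], [0.1, 0.3]],
    "oats": [[0.9, 0.8], [0.8, 0.9]],
}
print(classify([0.15, 0.25], references))  # -> corn
```

The design choice to aggregate pairwise similarities per class, rather than output class probabilities directly, is what lets the same trained network handle classes with very few labeled samples.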

DSN-BASED HSI CLASSIFICATION FRAMEWORK
Binary Classification Dataset Construction
Deep Similarity Network
Classification With Similarity Scores
DSN-Based HSI Classification Framework
Analysis of DSN
Data Sets Description and Quantitative Metrics
Experimental Settings
Ablation Experiment
Comparison of Different Methods
CONCLUSION

