Abstract

High-dimensional data in hyperspectral remote sensing leads to computational, analytical, and storage complexities. Dimensionality reduction serves as an efficient tool to remove redundant, irrelevant, and highly correlated features. Recently, deep learning approaches have made remarkable progress in hyperspectral data analysis. In this paper, a new end-to-end deep learning framework based on a teacher-student network, inspired by knowledge distillation, is proposed for deep feature selection. First, a complex teacher deep neural network is trained on the high-dimensional data to learn its best low-dimensional representation. The knowledge from this network is then transferred to a simple student network that performs feature selection. This eventually leads to deep neural network compression, which is of prime concern in hyperspectral remote sensing. Limited studies have explored the benefits of knowledge distillation on hyperspectral data. The proposed method can be employed to choose deep features for both supervised and unsupervised tasks. Experimental results demonstrate the performance of the proposed scheme using a limited number of features. In comparison to 1D and simple autoencoder models, the 2D model based on a convolutional autoencoder delivers higher classification accuracies: 96.15% on the Indian Pines dataset and 97.82% on the Pavia University dataset. A similar trend is observed with unsupervised learning. Furthermore, the proposed model has low sensitivity to parameter selection.
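The teacher-student transfer described above can be sketched in miniature. This is an illustrative toy only, not the authors' implementation: the dimensions, the random linear "teacher" encoder, and the band subset are all hypothetical stand-ins for the trained deep networks, and closed-form least squares stands in for gradient-based distillation training. The core idea it shows is the distillation objective: the student, restricted to a small subset of spectral bands, is fit to reproduce the teacher's low-dimensional representation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 200 spectral bands reduced to a 10-dimensional representation.
n_samples, n_bands, n_latent = 64, 200, 10
X = rng.normal(size=(n_samples, n_bands))  # toy hyperspectral pixels

# "Teacher": a linear encoder standing in for the complex deep network that
# has learned a low-dimensional representation of the data.
W_teacher = rng.normal(size=(n_bands, n_latent))
Z_teacher = X @ W_teacher  # teacher's soft targets

# "Student": keeps only a small subset of the original bands (feature selection)
# and maps them to the same latent space. The subset here is random/hypothetical;
# in the paper it is learned.
selected = np.sort(rng.permutation(n_bands)[: 3 * n_latent])
X_sel = X[:, selected]

# Distillation step: fit the student to mimic the teacher's representation.
# Least squares replaces the gradient-based training of the actual method.
W_student, *_ = np.linalg.lstsq(X_sel, Z_teacher, rcond=None)
Z_student = X_sel @ W_student

# Distillation loss: mean squared error between student and teacher outputs.
distill_loss = np.mean((Z_student - Z_teacher) ** 2)
print(f"bands kept: {len(selected)}/{n_bands}, distillation MSE: {distill_loss:.4f}")
```

Because the student sees only 30 of the 200 bands, its representation cannot match the teacher's exactly; the residual distillation loss quantifies what the selected features fail to capture, which is the signal a learned selection would minimize.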
