Abstract

Recently, classification and dimensionality reduction (DR) have become important issues in hyperspectral image (HSI) analysis. In particular, HSI classification is a challenging task because of the high-dimensional feature space, with a large number of spectral bands and a small number of labeled samples. In this paper, we propose a new HSI classification approach, called fused 3-D spectral-spatial deep neural networks for hyperspectral image classification. We propose an unsupervised band selection method that avoids redundancy between spectral bands and automatically finds a set of groups C_k, each containing similar spectral bands. The model then uses the different groups of selected bands to extract spectral-spatial features in order to improve the classification rate. Each group is associated with a 3-D CNN model, and the resulting models are fused to improve classification accuracy. The main advantage of the proposed method is that it preserves the initial spectral-spatial features by automatically selecting relevant spectral bands, which improves HSI classification when only a small number of labeled samples is available. Experiments on two real HSIs, the Indian Pines and Salinas datasets, demonstrate the effectiveness of the proposed method. Results show that the proposed method reaches competitive performance and achieves better classification rates than various state-of-the-art techniques.
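To make the multi-branch idea concrete, the following is a minimal sketch of a fused per-group 3-D CNN in PyTorch. It is an illustrative assumption, not the paper's implementation: the abstract does not specify the band-selection algorithm, the branch architecture, or the fusion rule, so the group sizes, the small two-layer 3-D convolutional branches, and feature concatenation used here are stand-ins chosen for clarity.

```python
# Hypothetical sketch of a fused multi-branch 3-D CNN (assumed architecture,
# not the paper's exact model). Each branch processes one group of similar
# spectral bands; branch features are fused by concatenation before the
# final classifier.
import torch
import torch.nn as nn


class Branch3DCNN(nn.Module):
    """One 3-D CNN branch operating on a single group of spectral bands."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # pool over (bands, height, width)
            nn.Flatten(),
            nn.Linear(16, feat_dim),
            nn.ReLU(),
        )

    def forward(self, x):
        # x: (batch, 1, n_bands_in_group, patch_h, patch_w)
        return self.features(x)


class FusedSpectralSpatialNet(nn.Module):
    """Fuses per-group branch features (here by concatenation) for classification."""

    def __init__(self, n_groups: int, n_classes: int, feat_dim: int = 64):
        super().__init__()
        self.branches = nn.ModuleList(Branch3DCNN(feat_dim) for _ in range(n_groups))
        self.classifier = nn.Linear(feat_dim * n_groups, n_classes)

    def forward(self, cubes):
        # cubes: list of per-group patch tensors, one per branch
        fused = torch.cat([b(c) for b, c in zip(self.branches, cubes)], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    # Example: 3 assumed band groups of sizes 30/40/30, 16 classes
    # (as in Indian Pines), and 9x9 spatial patches.
    group_sizes, n_classes = [30, 40, 30], 16
    model = FusedSpectralSpatialNet(n_groups=len(group_sizes), n_classes=n_classes)
    cubes = [torch.randn(2, 1, g, 9, 9) for g in group_sizes]
    print(model(cubes).shape)  # -> torch.Size([2, 16])
```

Concatenation is only one possible fusion strategy; averaging or weighting the branch outputs would fit the same multi-branch structure.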
