Abstract

Recently, deep learning has been widely used in hyperspectral image (HSI) classification to extract spectral-spatial information. However, fusing and exploiting spectral-spatial features efficiently remains a challenging task. To address this issue, in this letter we propose a spectral-spatial feature fusion method via a dual-stream deep architecture (SSDS) for HSI classification, which integrates two core modules: a dual stream and feature fusion. Specifically, one stream applies a bidirectional gated recurrent unit (Bi-GRU) to extract spectral features at the pixel level, while the other takes the corresponding image patch as input to a shallow multi-scale parallel 2D convolutional neural network (CNN) to extract multi-scale spatial features. The spectral and spatial features of the HSI are then fused by a feature fusion module, consisting of fusion-wise pooling and fully connected layers, to adaptively learn discriminative and useful features. Moreover, a spectral attention module is designed to further capture correlations among spectral bands and reduce interference from irrelevant features. Experimental results on three HSI datasets, i.e., Pavia University, Salinas, and Kennedy Space Center, demonstrate the effectiveness of the proposed SSDS method. The overall accuracies of SSDS on these three datasets are 99.22%, 97.99%, and 94.35%, respectively, exceeding other state-of-the-art methods by at least 0.4%, 0.33%, and 0.36%. In addition, SSDS is less time-consuming than most baselines; for instance, the training time on the Salinas data decreases from 1226.7 min (3DCNN) to 31.6 min.
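To illustrate the dual-stream idea described above, the following is a minimal PyTorch-style sketch, not the authors' implementation: it pairs a pixel-level Bi-GRU spectral branch with a shallow multi-scale 2D CNN spatial branch and fuses them with pooling and fully connected layers. The class name, layer widths, kernel sizes, and the omission of the spectral attention module are all assumptions made for brevity.

```python
import torch
import torch.nn as nn

class DualStreamSketch(nn.Module):
    """Hypothetical sketch of a dual-stream spectral-spatial classifier (not the SSDS code)."""

    def __init__(self, n_bands, n_classes, hidden=64):
        super().__init__()
        # Spectral stream: bidirectional GRU over the band sequence of a single pixel.
        self.bi_gru = nn.GRU(input_size=1, hidden_size=hidden,
                             batch_first=True, bidirectional=True)
        # Spatial stream: shallow multi-scale parallel 2D convolutions on the image patch.
        self.branch3 = nn.Conv2d(n_bands, 32, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(n_bands, 32, kernel_size=5, padding=2)
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fusion: concatenate pooled features from both streams, then fully connected layers.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden + 64, 128), nn.ReLU(),
            nn.Linear(128, n_classes))

    def forward(self, spectrum, patch):
        # spectrum: (B, n_bands); patch: (B, n_bands, H, W)
        _, h = self.bi_gru(spectrum.unsqueeze(-1))           # h: (2, B, hidden)
        spec_feat = torch.cat([h[0], h[1]], dim=1)           # (B, 2*hidden)
        spat = torch.cat([self.branch3(patch), self.branch5(patch)], dim=1)
        spat_feat = self.pool(torch.relu(spat)).flatten(1)   # (B, 64)
        return self.classifier(torch.cat([spec_feat, spat_feat], dim=1))

# Example forward pass with Pavia University dimensions (103 bands, 9 classes)
# and an assumed 9x9 patch size.
model = DualStreamSketch(n_bands=103, n_classes=9)
logits = model(torch.randn(4, 103), torch.randn(4, 103, 9, 9))
print(logits.shape)  # torch.Size([4, 9])
```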
