Abstract

Recently, convolutional neural networks have greatly improved the performance of hyperspectral image (HSI) classification. However, these methods mainly use local spatial–spectral information for HSI classification and require a large number of labeled samples to ensure high classification accuracy. In our study, we propose a multiscale nested U-Net (MsNU-Net) to capture global context information and improve HSI classification accuracy with a small number of labeled samples. We take an entire HSI as the input and construct a nested U-Net to perform the classification. Because scale is very important for image recognition, we propose a simple but effective multiscale loss function. Rather than building multiscale feature extraction into the network itself, this method uses Gaussian filters to construct multiscale data, feeds the multiscale data into the nested U-Net with shared parameters, and takes the sum of the loss functions at the different scales as the final loss function. In this way, it introduces global context information at multiple scales and thereby improves the classification accuracy. To demonstrate its effectiveness, we carried out classification experiments on four widely used HSIs. The results show that this method achieves higher classification accuracy than the compared methods when only a small number of labeled samples are available. The code for the proposed method is freely available at https://github.com/liubing220524/MsNU-Net.
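To make the multiscale loss concrete, the following is a minimal sketch in PyTorch of the scheme described above: Gaussian-smoothed copies of the input are passed through the same network with shared parameters, and the per-scale losses are summed. The `NestedUNet` module name, the choice of sigmas, and the kernel-size rule are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn.functional as F
import torchvision.transforms.functional as TF


def multiscale_loss(model, hsi, labels, sigmas=(0.0, 1.0, 2.0)):
    """Sum of per-scale losses over Gaussian-smoothed copies of the input.

    model  : a nested U-Net applied with shared parameters at every scale
             (e.g. a hypothetical `NestedUNet` module)
    hsi    : (1, C, H, W) hyperspectral cube
    labels : (1, H, W) per-pixel class indices
    sigmas : Gaussian standard deviations; sigma = 0 keeps the original scale
             (the specific values here are assumptions for illustration)
    """
    total = 0.0
    for sigma in sigmas:
        if sigma > 0:
            # Blur each band to build a coarser scale of the same spatial size.
            k = int(2 * round(3 * sigma) + 1)  # odd kernel wide enough for the blur
            x = TF.gaussian_blur(hsi, kernel_size=k, sigma=sigma)
        else:
            x = hsi
        logits = model(x)  # same network, shared weights across scales
        total = total + F.cross_entropy(logits, labels)
    return total
```

Because every scale is processed by the same network, the gradients from all scales accumulate into one set of shared parameters, which is what lets the multiscale supervision act as a regularizer when labeled samples are scarce.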
