Abstract

Deep learning has been widely applied to hyperspectral image (HSI) classification because of its excellent representation ability. However, the existing training scheme generally attaches a supervised classifier only to the last layer of the network, which makes it difficult to capture both full-scale fine-grained details and coarse-grained semantic information. Moreover, the robust performance of deep learning commonly relies on abundant training samples, so effective discriminant features cannot be learned well from small, class-imbalanced sample sets. To address these problems, a deeply-supervised pseudo learning framework (DSPL) is proposed, in which a deep-supervision global learning network with a pair-weighted loss is designed to achieve stronger predictions on small class-imbalanced datasets, while the deep-supervision architecture itself facilitates model generalization. To increase sample diversity, a semi-supervised learning method with confidence pseudo labels is proposed, which screens for more valid unlabeled samples and synthesizes new mixed samples. More specifically, the overall cost function consists of a supervised term (i.e., the labeled loss) and a semi-supervised consistency-regularization term (i.e., the unlabeled loss and the mixed loss), which significantly enhances the generalization of the network by exploiting all useful samples. As the experimental results reveal, DSPL outperforms other advanced methods on Indian Pines (highest OA of 99.54% with 5% of samples), Pavia University (highest OA of 99.79% with 0.5% of samples), and Houston University 2013 (highest OA of 99.32% with 5% of samples).
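
To make the loss composition concrete, the minimal sketch below shows one plausible way to combine a labeled loss, a confidence-screened pseudo-label (unlabeled) loss, and a mixed-sample loss as described in the abstract. The function name, the confidence threshold, the lambda weights, and the mixup-style sample mixing are assumptions introduced for illustration, not details taken from the paper; the deep-supervision auxiliary heads and the pair-weighted loss are omitted here.

```python
import torch
import torch.nn.functional as F

def dspl_loss(model, x_lab, y_lab, x_unl,
              conf_threshold=0.95, lambda_u=1.0, lambda_m=1.0):
    """Sketch of the combined objective: a supervised term plus
    consistency-regularization terms on pseudo-labeled and mixed samples.
    All hyperparameter values here are illustrative assumptions."""
    # Supervised term on the labeled batch.
    logits_lab = model(x_lab)
    loss_lab = F.cross_entropy(logits_lab, y_lab)

    # Confidence pseudo labels: keep only unlabeled predictions
    # whose softmax confidence exceeds the (assumed) threshold.
    with torch.no_grad():
        probs = F.softmax(model(x_unl), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = conf >= conf_threshold

    loss_unl = torch.tensor(0.0, device=x_lab.device)
    loss_mix = torch.tensor(0.0, device=x_lab.device)
    if mask.any():
        x_sel, y_sel = x_unl[mask], pseudo[mask]
        # Unlabeled term: fit the screened pseudo-labeled samples.
        loss_unl = F.cross_entropy(model(x_sel), y_sel)

        # Mixed term: interpolate labeled and pseudo-labeled samples
        # (a mixup-style mixing is assumed here).
        n = min(x_lab.size(0), x_sel.size(0))
        lam = torch.distributions.Beta(1.0, 1.0).sample().item()
        x_mix = lam * x_lab[:n] + (1 - lam) * x_sel[:n]
        logits_mix = model(x_mix)
        loss_mix = (lam * F.cross_entropy(logits_mix, y_lab[:n])
                    + (1 - lam) * F.cross_entropy(logits_mix, y_sel[:n]))

    # Overall cost: supervised term + weighted consistency-regularization terms.
    return loss_lab + lambda_u * loss_unl + lambda_m * loss_mix
```

In use, this loss would be computed per training step from a labeled batch and an unlabeled batch and backpropagated as usual; the screening mask simply drops low-confidence pseudo labels so that only the more reliable unlabeled samples contribute to the regularization terms.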

