Abstract
In research on the automatic interpretation of remote sensing images, semantic segmentation based on deep convolutional neural networks has developed rapidly, with steady improvements in segmentation accuracy and in the generalization ability of network models. However, most network designs target only the three visible RGB bands of remote sensing images so that mature semantic segmentation networks and pre-trained models for natural images can be reused directly, which wastes the spectral information carried by invisible bands such as the near-infrared (NIR). Exploiting the advantages of multispectral data in distinguishing typical land covers such as water and vegetation, we propose a novel deep neural network, the multispectral semantic segmentation network (MSNet), for semantic segmentation of multi-class land-cover scenes. The multispectral bands are split into two groups, visible and invisible, and each group is encoded by a ResNet-50 feature extractor; in the decoding stage, cascaded upsampling recovers feature-map resolution, and a feature pyramid structure fuses the multi-scale image and spectral features layer by layer to produce the final segmentation result. Training and validation on two publicly available datasets show that MSNet achieves competitive performance. The code is available at https://github.com/taochx/MSNet.
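The following is a minimal sketch of the dual-encoder idea described above, written in PyTorch since the released code is PyTorch-based. Channel widths, the concatenation-based fusion of the two spectral branches, the lateral/smoothing convolutions, and the class names (DualEncoderSegNet, invisible_bands, fpn_dim) are illustrative assumptions rather than the authors' exact implementation; see https://github.com/taochx/MSNet for the reference code.

```python
# Sketch of a dual-encoder multispectral segmentation network in the spirit of
# MSNet: two ResNet-50 encoders (visible RGB + invisible bands such as NIR),
# FPN-style layer-by-layer fusion, and cascaded upsampling in the decoder.
# Details here are assumptions, not the authors' exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


def _resnet_stages(in_channels: int) -> nn.ModuleList:
    """Build ResNet-50 stages whose stem accepts `in_channels` input bands."""
    backbone = resnet50(weights=None)
    stem = nn.Sequential(
        nn.Conv2d(in_channels, 64, kernel_size=7, stride=2, padding=3, bias=False),
        backbone.bn1, backbone.relu, backbone.maxpool,
    )
    return nn.ModuleList([stem, backbone.layer1, backbone.layer2,
                          backbone.layer3, backbone.layer4])


class DualEncoderSegNet(nn.Module):
    """Hypothetical MSNet-style network: separate encoders for visible and
    invisible bands, fused per stage and decoded with a feature pyramid."""

    def __init__(self, num_classes: int, invisible_bands: int = 1, fpn_dim: int = 256):
        super().__init__()
        self.enc_vis = _resnet_stages(3)                # visible RGB branch
        self.enc_inv = _resnet_stages(invisible_bands)  # NIR / other bands
        enc_dims = [64, 256, 512, 1024, 2048]           # ResNet-50 stage widths
        # 1x1 lateral convs project the concatenated branches to a common width.
        self.lateral = nn.ModuleList(
            [nn.Conv2d(2 * c, fpn_dim, kernel_size=1) for c in enc_dims])
        self.smooth = nn.ModuleList(
            [nn.Conv2d(fpn_dim, fpn_dim, kernel_size=3, padding=1) for _ in enc_dims])
        self.classifier = nn.Conv2d(fpn_dim, num_classes, kernel_size=1)

    def forward(self, rgb: torch.Tensor, inv: torch.Tensor) -> torch.Tensor:
        feats = []
        x_v, x_i = rgb, inv
        for stage_v, stage_i in zip(self.enc_vis, self.enc_inv):
            x_v, x_i = stage_v(x_v), stage_i(x_i)
            feats.append(torch.cat([x_v, x_i], dim=1))  # fuse spectral branches
        # Top-down pathway: start at the deepest stage, cascade upsampling, and
        # add the lateral projection of each shallower fused feature map.
        x = self.smooth[-1](self.lateral[-1](feats[-1]))
        for lat, smooth, feat in zip(reversed(self.lateral[:-1]),
                                     reversed(self.smooth[:-1]),
                                     reversed(feats[:-1])):
            x = F.interpolate(x, size=feat.shape[-2:], mode="bilinear",
                              align_corners=False)
            x = smooth(lat(feat) + x)
        logits = self.classifier(x)
        return F.interpolate(logits, size=rgb.shape[-2:], mode="bilinear",
                             align_corners=False)


if __name__ == "__main__":
    net = DualEncoderSegNet(num_classes=6, invisible_bands=1)
    rgb = torch.randn(1, 3, 256, 256)   # visible bands
    nir = torch.randn(1, 1, 256, 256)   # invisible (NIR) band
    print(net(rgb, nir).shape)          # torch.Size([1, 6, 256, 256])
```

Splitting the encoders keeps band-specific statistics separate until fusion, which is one plausible way to avoid drowning the single NIR channel in the RGB features; the paper's actual fusion scheme may differ.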