Abstract

Sea–land segmentation is a basic step in coastline extraction and nearshore target detection. Owing to their poor segmentation accuracy and complicated parameter tuning, traditional sea–land segmentation algorithms are difficult to apply in practice. Convolutional neural networks, which can extract multiple hierarchical features from images, offer an alternative approach to sea–land segmentation. Among these networks, BiSeNet performs well in the semantic segmentation of natural scene images and effectively balances segmentation accuracy and speed. For the sea–land segmentation of SAR images, however, BiSeNet cannot adequately extract contextual semantic and spatial information, so its segmentation results are poor. To address this problem, this study reduces the number of convolution layers in the spatial path to limit the loss of spatial information and adopts the lightweight ResNet18 model as the backbone of the context path to mitigate overfitting and provide a broad receptive field. In addition, an edge-enhancement strategy and a matching loss function are proposed to improve segmentation performance in the land–sea boundary region. Experimental results on GF3 data show that the proposed method effectively improves the prediction accuracy and segmentation speed of the network. The segmentation accuracy and F1 score of the proposed method are 0.9889 and 0.9915, respectively, and SAR image slices at a resolution of 1024 × 1024 are processed at 12.7 frames/s, outperforming other state-of-the-art approaches. Moreover, the size of the network is reduced by more than half compared with BiSeNet and is smaller than that of U-Net, and the network exhibits strong generalization performance.
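To make the described architecture concrete, the following PyTorch sketch illustrates the two modifications named in the abstract: a shortened spatial path (fewer convolution layers, stopping at 1/4 resolution to preserve spatial detail) and a ResNet18 context path. All module names, layer counts, and channel widths are illustrative assumptions, not the authors' implementation; the input is assumed to be single-channel SAR imagery.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18


class SpatialPath(nn.Module):
    """Reduced spatial path: two stride-2 conv blocks (output at 1/4
    resolution) instead of BiSeNet's three, to limit the loss of
    spatial information (assumed configuration)."""
    def __init__(self, out_ch=128):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1, bias=False),  # 1-channel SAR input
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class ContextPath(nn.Module):
    """Lightweight ResNet18 backbone: broad receptive field with a
    small parameter count, which helps against overfitting."""
    def __init__(self, out_ch=128):
        super().__init__()
        net = resnet18(weights=None)
        # Swap the stem conv for single-channel SAR input.
        net.conv1 = nn.Conv2d(1, 64, 7, stride=2, padding=3, bias=False)
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)
        self.layers = nn.Sequential(net.layer1, net.layer2, net.layer3, net.layer4)
        self.project = nn.Conv2d(512, out_ch, 1, bias=False)

    def forward(self, x):
        f = self.layers(self.stem(x))  # 1/32 resolution, 512 channels
        return self.project(f)


class SeaLandNet(nn.Module):
    """Two-path network with a simple fuse-and-classify head."""
    def __init__(self, n_classes=2):  # sea vs. land
        super().__init__()
        self.spatial = SpatialPath()
        self.context = ContextPath()
        self.head = nn.Conv2d(256, n_classes, 1)

    def forward(self, x):
        sp = self.spatial(x)                         # 1/4 resolution
        cx = self.context(x)                         # 1/32 resolution
        cx = F.interpolate(cx, size=sp.shape[2:], mode="bilinear",
                           align_corners=False)
        logits = self.head(torch.cat([sp, cx], dim=1))
        return F.interpolate(logits, size=x.shape[2:], mode="bilinear",
                             align_corners=False)


if __name__ == "__main__":
    out = SeaLandNet()(torch.randn(1, 1, 1024, 1024))
    print(out.shape)  # torch.Size([1, 2, 1024, 1024])

Note that this sketch fuses the two paths by plain concatenation; the actual BiSeNet family uses attention refinement and feature fusion modules, and the abstract's edge-enhancement strategy and loss function are not detailed enough here to reconstruct, so both are omitted.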
