Abstract

Since the number of superpixels is far lower than the number of pixels, superpixels can substantially speed up subsequent processing steps and have been widely used in synthetic aperture radar (SAR) image segmentation. However, in most existing superpixel-wise segmentation algorithms, superpixel prediction is an isolated preprocessing step, independent of the segmentation task, so segmentation quality is bounded by superpixel accuracy. Once superpixels are generated, their shapes cannot be changed in the subsequent segmentation stage, even if a single superpixel contains pixels of different land covers. To address this, we propose an end-to-end trainable superpixel-wise segmentation method for single-polarization SAR images. First, we design a differentiable boundary-aware clustering method for estimating task-specific superpixels. Instead of the hard association between pixels and superpixels used in existing superpixel algorithms, this method introduces a soft association map that makes the clustering differentiable; hence, it can be implemented with a simple deep fully convolutional network. For the segmentation part, we propose a novel soft graph convolutional network (Soft-GCN), which takes the association map as input and performs superpixel-wise segmentation. The advantage of our method is that the superpixel-generation and graph-convolution parts can be trained under a unified framework until both reach their optimal parameters. During training, the method adaptively adjusts superpixel shapes according to the segmentation results, ensuring that the superpixels correctly adhere to land-cover boundaries. Experimental results on simulated and real SAR images demonstrate that our method outperforms other state-of-the-art segmentation algorithms while also being faster.
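To make the two differentiable pieces described above concrete, the following is a minimal sketch (not the authors' released code) assuming a PyTorch-style implementation: a soft association map Q between pixels and K candidate superpixels predicted by a convolutional head, and a superpixel-wise graph convolution that pools pixel features through Q, mixes superpixel nodes with an adjacency matrix, and scatters the result back to pixels. The shapes, the backbone producing `feats`, and the adjacency construction are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftAssociation(nn.Module):
    """FCN head predicting, for every pixel, a soft (differentiable)
    assignment over K candidate superpixels instead of a hard label."""
    def __init__(self, in_ch: int, num_superpixels: int):
        super().__init__()
        self.head = nn.Conv2d(in_ch, num_superpixels, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) -> Q: (B, K, H*W); columns sum to 1 per pixel
        logits = self.head(feats)            # (B, K, H, W)
        q = F.softmax(logits, dim=1)         # soft pixel-superpixel association
        return q.flatten(2)                  # (B, K, N) with N = H*W

class SoftGCNLayer(nn.Module):
    """One superpixel-wise graph convolution: pool pixel features into
    superpixel nodes with Q, mix nodes with a normalized adjacency A,
    then unpool back to pixels. Every step is differentiable, so the
    segmentation loss also reshapes the superpixels through Q."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.lin = nn.Linear(in_ch, out_ch)

    def forward(self, feats, q, adj):
        # feats: (B, C, H, W); q: (B, K, N); adj: (B, K, K), row-normalized
        b, c, h, w = feats.shape
        x = feats.flatten(2).transpose(1, 2)             # (B, N, C)
        norm = q.sum(dim=2, keepdim=True).clamp(min=1e-6)
        nodes = q @ x / norm                             # (B, K, C) soft pooling
        nodes = F.relu(self.lin(adj @ nodes))            # graph convolution
        pix = q.transpose(1, 2) @ nodes                  # (B, N, out) unpooling
        return pix.transpose(1, 2).reshape(b, -1, h, w)  # back to (B, out, H, W)
```

Because both the pooling and the unpooling go through Q, gradients from the pixel-wise segmentation loss flow into the association head; this is what allows a unified framework of this kind to adjust superpixel shapes during training rather than freezing them after a preprocessing step.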
