Abstract

Image segmentation is one of the most significant problems in computer vision. Recently, deep-learning methods have dominated state-of-the-art solutions that automatically or interactively divide an image into subregions. However, deep-learning approaches are limited by their need for a substantial amount of training data, which is costly to prepare. An alternative is semi-supervised image segmentation, which uses rough annotations to define constraints that are then generalized to precisely delimit relevant image regions without using training examples. Among semi-supervised strategies for image segmentation, the leading ones are graph-based techniques, which cast image segmentation as the partitioning of a pixel or region affinity graph. This paper revisits the problem of graph-based image segmentation. It approaches the problem as semi-supervised node classification in the SLIC superpixel region adjacency graph using a graph convolutional network (GCN). The performance of both spectral and spatial graph convolution operators is considered, represented by the Chebyshev convolution operator and GraphSAGE, respectively. The results of the proposed method applied to binary and multi-label segmentation are presented, numerically assessed, and analyzed. In its best variant, the proposed method scored an average DICE of 0.86 in the binary segmentation task and 0.79 in the multi-label segmentation task. Comparison with state-of-the-art graph-based methods, including Random Walker and GrabCut, shows that graph convolutional networks can be an attractive alternative to existing graph-based semi-supervised image segmentation methods.
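The core idea described above — propagating sparse seed labels over a region adjacency graph with graph convolutions — can be illustrated with a minimal, self-contained sketch. The toy 6-node graph, the seed placement, and the use of an untrained propagation step (rather than the paper's trained Chebyshev/GraphSAGE models or a real SLIC oversegmentation) are all simplifying assumptions for illustration only:

```python
import numpy as np

# Toy region-adjacency graph of 6 "superpixels": nodes 0-2 form one
# connected cluster, nodes 3-5 another, with a single bridging edge 2-3.
# (This hand-made graph stands in for a SLIC RAG; it is not the paper's data.)
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

# Symmetrically normalized adjacency with self-loops,
# D^{-1/2} (A + I) D^{-1/2}, the propagation matrix used by GCN layers.
A_hat = A + np.eye(len(A))
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))

# Scribble-style seeds: node 0 annotated as class 0 (e.g. foreground),
# node 5 as class 1 (e.g. background); the rest are unlabeled.
labels = np.zeros((6, 2))
labels[0, 0] = 1.0
labels[5, 1] = 1.0

# Two propagation steps; a two-layer GCN with identity weights and no
# nonlinearity reduces to exactly this label smoothing.
H = A_norm @ (A_norm @ labels)

# Each unlabeled superpixel takes the class with the strongest signal.
pred = H.argmax(axis=1)
print(pred)  # -> [0 0 0 1 1 1]
```

In the method the abstract summarizes, the propagation weights would instead be learned (via Chebyshev or GraphSAGE convolutions) and node features would come from superpixel statistics, but the graph structure and seed-driven classification follow this pattern.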
