Abstract

Existing few-shot segmentation approaches typically compare the semantic prototype vectors of the query image and the support images to obtain the segmentation result. However, recent studies have shown that a single feature vector in a feature map cannot accurately represent pixel-level categories, leading to poor segmentation of object boundaries and semantic ambiguity. To address this common problem, we propose a novel contour-aware network (CTANet) for few-shot segmentation. Unlike the usual practice of classifying each pixel separately, CTANet treats all pixels within the same contour as a whole, exploiting the internal consistency of objects to obtain a more accurate representation of category information. To obtain accurate object contours, our network consists of a contour generation module and a contour refinement module: the former exploits multiple levels of features to generate a primary contour map, and the latter learns to refine it. Furthermore, a novel contour-aware mixed loss is proposed that fuses the common binary cross-entropy (BCE) loss with our contour-aware loss to supervise the training process at two levels, pixel level and contour level. Extensive experiments demonstrate that our CTANet achieves new state-of-the-art performance on $\text{PASCAL-5}^{i}$ and $\text{COCO-20}^{i}$. We hope our new perspective provides more clues for future research on few-shot segmentation. Our code is freely available at https://github.com/hardtogetA/CTANet.
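
The abstract describes a mixed loss that combines pixel-level BCE supervision with a contour-level term. The following is a minimal sketch of how such a two-level supervision signal could be assembled; the function name, the weighting parameter `alpha`, and the use of BCE for the contour term are assumptions for illustration, not the paper's exact formulation.

```python
import torch.nn.functional as F

def contour_aware_mixed_loss(mask_logits, gt_mask, contour_logits, gt_contour, alpha=0.5):
    """Hypothetical sketch of a contour-aware mixed loss.

    Combines pixel-level supervision on the segmentation mask with
    contour-level supervision on the predicted contour map. The weighting
    scheme (alpha) and the BCE form of the contour term are assumptions.
    """
    # Pixel-level term: standard BCE on the segmentation mask logits.
    pixel_loss = F.binary_cross_entropy_with_logits(mask_logits, gt_mask)
    # Contour-level term: placeholder BCE on the contour map logits,
    # standing in for the paper's contour-aware loss.
    contour_loss = F.binary_cross_entropy_with_logits(contour_logits, gt_contour)
    # Fuse the two supervision signals into one training objective.
    return pixel_loss + alpha * contour_loss
```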
