Abstract
Single-stage instance segmentation approaches have attracted increasing attention in the research community because they are faster, simpler in design, and competitive in accuracy with two-stage methods. However, existing methods produce only pixel-level classifications, ignoring shape and object-boundary information, which results in coarse, fuzzy mask predictions and inaccurate localization. To address these issues, we propose BCondInst, an effective boundary-preserving conditional convolution network that exploits object boundary information to improve the accuracy of mask localization. Specifically, BCondInst replaces the original mask branch with a boundary-mask branch containing two sub-networks designed to jointly learn object boundaries and masks. Unlike existing segmentation approaches, in which the filter parameters of the boundary head remain fixed for all instances, the filters of both the mask head and the boundary head in BCondInst are generated conditioned on each specific instance. Furthermore, we design a post-processing mechanism that uses the predicted boundaries to further improve the quality of the predicted masks at test time. Consequently, masks predicted by BCondInst generally preserve more detail and more accurate boundaries. Extensive experiments on the Microsoft COCO 2017 benchmark show that the proposed BCondInst outperforms current popular one-stage instance segmentation approaches by a large margin.
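To make the instance-conditioned filter idea in the abstract concrete, the following PyTorch sketch illustrates how per-instance filters for a mask head and a boundary head could be generated by a controller network, in the spirit of conditional convolutions. The class and parameter names (`DynamicMaskBoundaryHead`, `controller`), the layer sizes, and the 1x1-convolution structure are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicMaskBoundaryHead(nn.Module):
    """Illustrative sketch: per-instance filters for a mask head and a
    boundary head, generated by a controller (dynamic convolutions).
    Sizes and names are assumptions, not the paper's configuration."""

    def __init__(self, feat_channels=8, embed_dim=256):
        super().__init__()
        self.feat_channels = feat_channels
        # Each head here is two dynamic 1x1 convs: C -> C and C -> 1.
        self.n_params_per_head = (feat_channels * feat_channels + feat_channels) \
                                 + (feat_channels * 1 + 1)
        # Controller maps an instance embedding to the filter parameters
        # of both heads (hypothetical embedding size).
        self.controller = nn.Linear(embed_dim, 2 * self.n_params_per_head)

    def _run_head(self, feat, params):
        c = self.feat_channels
        # Split the flat parameter vector into weights/biases of two 1x1 convs.
        w1, b1, w2, b2 = torch.split(params, [c * c, c, c, 1])
        x = F.relu(F.conv2d(feat, w1.view(c, c, 1, 1), b1))
        return F.conv2d(x, w2.view(1, c, 1, 1), b2)

    def forward(self, mask_feat, inst_embeddings):
        # mask_feat: (1, C, H, W) shared feature map; one prediction per instance.
        masks, boundaries = [], []
        for emb in inst_embeddings:            # loop over detected instances
            params = self.controller(emb)      # instance-conditioned filters
            p_mask, p_bnd = params.chunk(2)
            masks.append(self._run_head(mask_feat, p_mask).sigmoid())
            boundaries.append(self._run_head(mask_feat, p_bnd).sigmoid())
        return masks, boundaries
```

In the full BCondInst pipeline described in the abstract, the predicted boundaries would additionally drive a post-processing step that refines the final masks at test time; that step is not sketched here.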