Abstract
Recently, various fully supervised learning methods have been successfully applied to lung CT image segmentation. However, pixel-wise annotations are extremely expert-demanding and labor-intensive, while the performance of unsupervised learning methods fails to meet the demands of practical applications. To achieve a reasonable trade-off between performance and label dependency, we introduce a novel weakly supervised inpainting-based learning method in which only bounding-box labels are required for accurate segmentation. Specifically, lesion regions are first detected by an object detection network; we then crop them out of the input image and restore the missing holes to normal appearance using a progressive CT inpainting network (PCIN). Finally, a post-processing method is designed to obtain the accurate segmentation mask from the difference image between the input and recovered images. In addition, the real statistics (i.e., number, location, and size) of the lesion bounding boxes in the dataset guide the construction of the training set for PCIN. We apply a multi-scale supervision strategy to train PCIN for progressive and stable inpainting. Moreover, to remove the visual artifacts caused by the invalid features of the missing holes, an initial patch generation network (IPGN) is proposed to initialize the holes with generated pseudo-healthy image patches. Experiments on the public COVID-19 dataset demonstrate that PCIN is outstanding at lung CT image inpainting and that the performance of our proposed weakly supervised method is comparable to that of fully supervised methods.
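A minimal sketch of the difference-image post-processing step described above, assuming the recovered image is the PCIN output and the boxes come from the detector; the threshold value and morphological cleanup here are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import ndimage


def segmentation_from_difference(input_img, recovered_img, boxes, thresh=0.1):
    """Derive a lesion mask from the difference between the input CT slice
    and its inpainted (pseudo-healthy) counterpart.

    input_img, recovered_img : 2-D float arrays normalized to [0, 1]
    boxes  : list of (y0, x0, y1, x1) lesion bounding boxes from the detector
    thresh : illustrative intensity-difference threshold (assumption)
    """
    diff = np.abs(input_img - recovered_img)

    # Restrict candidate pixels to the detected bounding boxes.
    roi = np.zeros_like(diff, dtype=bool)
    for y0, x0, y1, x1 in boxes:
        roi[y0:y1, x0:x1] = True

    mask = (diff > thresh) & roi

    # Simple morphological cleanup to suppress isolated noise (assumption).
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    mask = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
    return mask.astype(np.uint8)
```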