Abstract

State-of-the-art deep learning methods have demonstrated impressive performance on segmentation tasks. However, the success of these methods depends on a large number of manually labeled masks, which are expensive and time-consuming to collect. In this work, a novel consistent perception generative adversarial network (CPGAN) is proposed for semi-supervised stroke lesion segmentation. The proposed CPGAN reduces the reliance on fully labeled samples. Specifically, a similarity connection module (SCM) is designed to capture information from multi-scale features. The SCM selectively aggregates the features at each position by a weighted sum. Moreover, a consistent perception strategy is introduced into the proposed model to improve brain stroke lesion prediction on unlabeled data. Furthermore, an assistant network is constructed to encourage the discriminator to learn meaningful feature representations that are often forgotten during training. The assistant network and the discriminator jointly decide whether the segmentation results are real or fake. The CPGAN was evaluated on the Anatomical Tracings of Lesions After Stroke (ATLAS) dataset. The experimental results demonstrate that the proposed network achieves superior segmentation performance. In the semi-supervised segmentation task, the proposed CPGAN using only two-fifths of the labeled samples outperforms some approaches that use all labeled samples.
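The abstract describes the SCM as aggregating features at each position by a weighted sum over multi-scale feature maps. The paper's exact formulation is not given here, so the following is only a minimal sketch assuming a non-local, self-attention-style weighting; the class name `SimilarityConnectionModule` and the `reduction` parameter are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a position-wise weighted-sum aggregation module (assumed
# self-attention-style reading of the SCM; the paper may differ).
import torch
import torch.nn as nn


class SimilarityConnectionModule(nn.Module):
    """Aggregates features at each spatial position by a weighted sum,
    with weights derived from pairwise feature similarity (assumption)."""

    def __init__(self, in_channels: int, reduction: int = 8):
        super().__init__()
        inter = max(in_channels // reduction, 1)
        self.query = nn.Conv2d(in_channels, inter, kernel_size=1)
        self.key = nn.Conv2d(in_channels, inter, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)        # (B, HW, C')
        k = self.key(x).flatten(2)                           # (B, C', HW)
        v = self.value(x).flatten(2)                         # (B, C, HW)
        attn = torch.softmax(q @ k, dim=-1)                  # similarity weights (B, HW, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)    # weighted sum per position
        return self.gamma * out + x                          # residual connection


# Usage: refine a feature map before the segmentation head.
if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)
    scm = SimilarityConnectionModule(64)
    print(scm(feats).shape)  # torch.Size([2, 64, 32, 32])
```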
