Abstract

Medical image segmentation has achieved impressive results thanks to U-Net and its variants. Yet most existing methods perform segmentation by classifying individual pixels and tend to ignore shape-intensity prior information, which may yield implausible segmentation results. Moreover, segmentation performance often drops sharply on unseen datasets. One likely reason is that the model is biased towards texture information, which varies more than shape information across datasets. In this paper, we introduce a novel Shape-Intensity-Guided U-Net (SIG-UNet) to improve the generalization ability of U-Net variants in segmenting medical images. Specifically, we adopt the U-Net architecture to reconstruct class-wise averaged images that contain only shape-intensity information. We then add an extra decoder branch, similar to the reconstruction decoder, for segmentation, and apply skip fusion between the two. Since the class-wise averaged image contains no texture information, the reconstruction decoder focuses more on shape and intensity features than the encoder does on the original image, so the final segmentation decoder is less texture-biased. Extensive experiments on three medical image segmentation tasks with different modalities demonstrate that the proposed SIG-UNet achieves promising intra-dataset results while significantly improving cross-dataset segmentation performance. The source code will be made publicly available after acceptance.
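The class-wise averaged reconstruction target described above can be illustrated with a minimal sketch: each pixel is replaced by the mean intensity of all pixels sharing its class label, so the target retains the region's shape and overall intensity but discards texture. This is an illustrative pure-Python version (the function name and list-based image representation are our own for this example, not from the paper's code):

```python
def class_wise_averaged_image(image, mask):
    """Build a texture-free reconstruction target.

    image: 2-D list of pixel intensities.
    mask:  2-D list of class labels, same shape as image.
    Returns a 2-D list where every pixel holds the mean intensity
    of its class, keeping shape and intensity but no texture.
    """
    sums, counts = {}, {}
    # Accumulate per-class intensity sums and pixel counts.
    for img_row, msk_row in zip(image, mask):
        for value, label in zip(img_row, msk_row):
            sums[label] = sums.get(label, 0.0) + value
            counts[label] = counts.get(label, 0) + 1
    means = {label: sums[label] / counts[label] for label in sums}
    # Replace each pixel by its class mean.
    return [[means[label] for label in msk_row] for msk_row in mask]

# Toy 2x2 example with two classes: class 0 averages to 2.0,
# class 1 averages to 12.0.
image = [[1.0, 3.0],
         [10.0, 14.0]]
mask = [[0, 0],
        [1, 1]]
print(class_wise_averaged_image(image, mask))  # [[2.0, 2.0], [12.0, 12.0]]
```

In the paper's pipeline such targets would be produced from the ground-truth masks and used to supervise the reconstruction decoder, which is why that decoder learns shape-intensity features rather than texture.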
