Abstract
Liver segmentation is critical for the localization and diagnosis of liver cancer. Variants of the U-Net architecture with skip connections have become popular in medical image segmentation. However, these variants tend to fuse semantically dissimilar feature maps via simple skip connections between the encoder and decoder paths. We argue that the network's learning task becomes easier when the feature maps fused from the encoder and decoder paths are semantically similar; fusing dissimilar feature maps introduces semantic gaps between them. The method proposed in this paper therefore aims to obtain semantically similar feature maps, alleviate the semantic gaps caused by simple skip connections, and improve segmentation accuracy. We propose a new U-Net architecture named Multi-Scale Nested U-Net (MSN-Net), which consists of Res-blocks and MSCF-blocks. The Res-block with a bottleneck layer makes the network deeper while avoiding vanishing gradients. To alleviate the semantic gaps, we redesign the skip connection: the novel skip connection consists of MSCF-blocks and dense connections. The MSCF-block combines high-level and low-level features with multi-scale semantic information to obtain more representative features, and dense connections are adopted between MSCF-blocks. In addition, we use a weighted loss function that combines cross-entropy loss and Dice loss. The proposed method is evaluated on the MICCAI 2017 LiTS Challenge dataset. Experimental results demonstrate that MSN-Net, with its novel skip connections, effectively alleviates the semantic gaps between the encoder and decoder paths, improves segmentation accuracy, and outperforms other state-of-the-art methods.
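The weighted loss combining cross-entropy and Dice loss can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the weighting coefficient `alpha` and the smoothing term `eps` are assumptions, since the abstract does not give the exact formulation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2|P∩T| / (|P| + |T|), with eps for stability."""
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def cross_entropy_loss(pred, target, eps=1e-6):
    """Binary cross-entropy averaged over pixels; pred is clipped to avoid log(0)."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(np.mean(-(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))))

def weighted_loss(pred, target, alpha=0.5):
    """Hypothetical convex combination; the paper's actual weight is not
    stated in the abstract."""
    return alpha * cross_entropy_loss(pred, target) + (1.0 - alpha) * dice_loss(pred, target)
```

Cross-entropy drives per-pixel probability calibration, while the Dice term directly optimizes region overlap, which helps when the liver occupies a small fraction of the image and classes are imbalanced.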