Abstract

Computed tomography (CT) is of great significance for the localization and diagnosis of liver cancer, and deep learning methods have recently been widely applied to segment the liver and liver tumors in CT images. Compared with natural images, medical images are generally more challenging to segment because of blurry boundaries and complex intensity gradients. To address these problems in liver tumor images, a deeply supervised network combining efficient channel attention with Res-UNet++ (ECA residual UNet++) is proposed for liver CT image segmentation, enabling fully automatic end-to-end segmentation. The UNet++ structure is adopted as the baseline. A context-aware residual-block feature encoder strengthens feature extraction and alleviates the degradation problem of deep networks. An efficient channel attention module fuses the channel (depth) information of the feature maps with spatial information to reduce the impact of uneven sample distribution, and the Dice loss replaces the cross-entropy loss function for optimizing the network parameters. On the LITS dataset, the liver and liver tumor segmentation accuracies are 95.8% and 89.3%, respectively. The results show that, compared with other algorithms, the proposed method achieves good segmentation performance and provides a useful reference for fine segmentation of the liver and liver tumors in computer-assisted diagnosis and treatment.
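Two components named in the abstract can be made concrete: the efficient channel attention (ECA) module and the Dice loss used in place of cross-entropy. The sketch below is a minimal PyTorch rendering of both, based on the published ECA design and the standard soft-Dice formulation rather than the authors' released code; the class names, kernel size, and smoothing constant are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class ECABlock(nn.Module):
    """Efficient channel attention: global average pooling, a 1D convolution
    across the channel dimension, and a sigmoid gate applied channel-wise."""
    def __init__(self, channels, k_size=3):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                              padding=(k_size - 1) // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        # x: (B, C, H, W) -> channel descriptor (B, C, 1, 1)
        y = self.avg_pool(x)
        # 1D convolution over channels captures local cross-channel interaction
        y = self.conv(y.squeeze(-1).transpose(-1, -2)).transpose(-1, -2).unsqueeze(-1)
        return x * self.sigmoid(y)


class DiceLoss(nn.Module):
    """Soft Dice loss: 1 - 2|P∩G| / (|P| + |G|), smoothed for empty masks."""
    def __init__(self, smooth=1.0):
        super().__init__()
        self.smooth = smooth

    def forward(self, logits, target):
        probs = torch.sigmoid(logits)
        probs, target = probs.flatten(1), target.flatten(1)
        intersection = (probs * target).sum(dim=1)
        dice = (2 * intersection + self.smooth) / (
            probs.sum(dim=1) + target.sum(dim=1) + self.smooth)
        return 1 - dice.mean()
```

In this reading, the ECA block would be inserted after the residual encoder blocks to reweight channels at negligible parameter cost, while the Dice loss directly optimizes region overlap, which is less sensitive than cross-entropy to the class imbalance between small tumors and large background regions.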
