Abstract

Magnetic resonance imaging (MRI) plays a crucial role in the diagnosis of ischemic stroke. Accurate segmentation of the infarct is of great significance for selecting intervention treatment methods and evaluating the prognosis of patients. To address the poor segmentation accuracy of existing methods on multiscale stroke lesions, a novel encoder-decoder network based on depthwise separable convolution is proposed. Firstly, this network replaces the convolutional layer modules of the U-Net with redesigned depthwise separable convolution modules. Secondly, a modified atrous spatial pyramid pooling (MASPP) is introduced to enlarge the receptive field and enhance the extraction of multiscale features. Thirdly, an attention gate (AG) structure is incorporated at the skip connections of the network to further improve the segmentation accuracy of multiscale targets. Finally, experimental evaluations are conducted on the Ischemic Stroke Lesion Segmentation 2022 challenge (ISLES2022) dataset. The proposed algorithm achieves Dice similarity coefficient (DSC), Hausdorff distance (HD), sensitivity (SEN), and precision (PRE) scores of 0.8165, 3.6681, 0.8892, and 0.8946, respectively, outperforming other mainstream segmentation algorithms. The experimental results demonstrate that the proposed method effectively improves the segmentation of infarct lesions and is expected to provide reliable support for clinical diagnosis and treatment.
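The abstract reports DSC, sensitivity, and precision on binary lesion masks. As a minimal sketch of how these overlap-based metrics are typically computed (not the paper's actual evaluation code; the toy masks below are illustrative, not ISLES2022 data):

```python
def segmentation_metrics(pred, gt):
    """DSC, sensitivity (recall), and precision from flat binary masks.

    pred, gt: sequences of 0/1 of equal length (flattened segmentation masks).
    Degenerate denominators (empty masks) are scored as 1.0 by convention.
    """
    tp = sum(1 for p, g in zip(pred, gt) if p and g)        # true positives
    fp = sum(1 for p, g in zip(pred, gt) if p and not g)    # false positives
    fn = sum(1 for p, g in zip(pred, gt) if g and not p)    # false negatives
    dsc = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 1.0
    sen = tp / (tp + fn) if (tp + fn) else 1.0
    pre = tp / (tp + fp) if (tp + fp) else 1.0
    return dsc, sen, pre

# Toy 4x4 masks flattened row by row (illustrative only)
gt   = [0, 0, 0, 0,  0, 1, 1, 0,  0, 1, 1, 0,  0, 0, 0, 0]
pred = [0, 0, 0, 0,  0, 1, 1, 0,  0, 1, 0, 0,  0, 0, 0, 0]
dsc, sen, pre = segmentation_metrics(pred, gt)
print(dsc, sen, pre)
```

Here the prediction misses one of the four lesion voxels, so sensitivity drops to 0.75 while precision stays at 1.0; DSC balances the two. The Hausdorff distance reported in the abstract is a boundary-distance metric and is usually computed with a dedicated library routine rather than from these counts.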
