Abstract

The segmentation of multiple sclerosis (MS) lesions from MR imaging sequences remains a challenging task, owing to lesions of varying shape, scattered distribution, and unknown number. Current automated MS segmentation methods based on deep learning face two challenges: (1) capturing scattered lesions across multiple regions and (2) delineating the global contours of lesions with variant features. To address these challenges, in this paper, we propose a novel attention and graph-driven network (DAG-Net), which incorporates (1) spatial correlations for embracing lesions in distant regions and (2) global context for better representing lesions of variant features in a unified architecture. Firstly, a novel local attention coherence mechanism is designed to construct dynamic and expansible graphs that encode the spatial correlations between pixels and their proximities. Secondly, the proposed spatial-channel attention module aggregates relevant features to refine the delineation of global lesion contours. Moreover, with the dynamic graphs, the learning process of the DAG-Net is interpretable, which in turn supports the reliability of the segmentation results. Extensive experiments were conducted on the public ISBI2015 dataset and an in-house dataset, comparing DAG-Net against state-of-the-art methods on geometric and clinical metrics. The experimental results validate the effectiveness of the proposed DAG-Net in segmenting variant and scattered lesions across multiple regions.
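The spatial-channel attention described above can be illustrated with a minimal, generic sketch. Note this is not the paper's implementation: the function names and the simple mean-pool-plus-sigmoid gating below are illustrative assumptions in the style of common attention modules, showing only the general idea of reweighting a feature map first per channel and then per spatial location.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    """Gate each channel of a (C, H, W) feature map by a squeezed descriptor.

    Spatial dimensions are average-pooled to one scalar per channel, and a
    sigmoid turns that scalar into a per-channel weight in (0, 1).
    """
    pooled = x.mean(axis=(1, 2))            # (C,)
    weights = _sigmoid(pooled)              # (C,) gates
    return x * weights[:, None, None]

def spatial_attention(x):
    """Gate each spatial location of a (C, H, W) map by a channel-pooled map.

    Channels are average-pooled to a (H, W) saliency map, and a sigmoid
    turns it into per-pixel weights highlighting lesion-like regions.
    """
    pooled = x.mean(axis=0)                 # (H, W)
    weights = _sigmoid(pooled)              # (H, W) gates
    return x * weights[None, :, :]

def spatial_channel_attention(x):
    """Apply channel gating followed by spatial gating, preserving shape."""
    return spatial_attention(channel_attention(x))
```

In practice such modules use learned convolutions and both max- and average-pooling rather than the fixed pooling above; the sketch only conveys the sequential channel-then-spatial reweighting that enhances relevant features before contour delineation.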
