Abstract
Objective: Accurate nidus segmentation and quantification have long been challenging but important tasks in the clinical management of Cerebral Arteriovenous Malformation (CAVM). However, dilemmas remain in nidus segmentation, such as difficulty defining the demarcation of the nidus, observer-dependent variation, and time consumption. The aim of this study is to develop an artificial intelligence (AI) model to automatically segment the nidus on Time-Of-Flight Magnetic Resonance Angiography (TOF-MRA) images.

Methods: A total of 92 patients with CAVM who underwent both TOF-MRA and Digital Subtraction Angiography (DSA) examinations were enrolled. Two neurosurgeons manually segmented the nidus on TOF-MRA images, and these segmentations were regarded as the ground-truth reference. A U-Net-based AI model was created for automatic nidus detection and segmentation on TOF-MRA images.

Results: The mean nidus volumes of the AI segmentation model and the ground truth were 5.427 ± 4.996 and 4.824 ± 4.567 mL, respectively. The mean difference in nidus volume between the two groups was 0.603 ± 1.514 mL, which was not statistically significant (P = 0.693). The Dice Similarity Coefficient (DSC), precision, and recall on the test set were 0.754 ± 0.074, 0.713 ± 0.102, and 0.816 ± 0.098, respectively. The linear correlation coefficient of the nidus volume between the two groups was 0.988 (p < 0.001).

Conclusion: The performance of the AI segmentation model is moderately consistent with that of manual segmentation. This AI model has great potential in clinical settings, such as preoperative planning, treatment efficacy evaluation, risk stratification, and follow-up.
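The DSC, precision, and recall reported above are standard overlap metrics for comparing a predicted binary segmentation mask against a ground-truth mask. As an illustration only (the function name and the toy masks below are hypothetical, not from the study), a minimal NumPy sketch of how these three metrics are computed from per-voxel true positives, false positives, and false negatives:

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Compute Dice Similarity Coefficient (DSC), precision, and recall
    for two binary segmentation masks of the same shape."""
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    tp = np.logical_and(pred, gt).sum()      # voxels labeled nidus in both
    fp = np.logical_and(pred, ~gt).sum()     # predicted nidus, absent in ground truth
    fn = np.logical_and(~pred, gt).sum()     # missed ground-truth nidus voxels
    dsc = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return dsc, precision, recall

# Toy 2-D example (a real case would use 3-D TOF-MRA voxel masks):
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
gt = np.array([[1, 0, 0],
               [0, 1, 1]])
dsc, p, r = segmentation_metrics(pred, gt)  # tp=2, fp=1, fn=1
```

In this toy case each metric equals 2/3; on real volumes, the same counts taken over all voxels yield the per-patient DSC, precision, and recall that are then averaged across the test set.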