Abstract

The objective of this study was to develop an automated segmentation method for the anterior cruciate ligament capable of facilitating quantitative assessments of the ligament in clinical and research settings. A modified U-Net fully convolutional network model was trained, validated, and tested on 246 Constructive Interference in Steady State magnetic resonance images of intact anterior cruciate ligaments. Overall model performance was assessed on the image set relative to an experienced (>5 years) "ground truth" segmenter in two domains: anatomical similarity and the accuracy of quantitative measurements (i.e., signal intensity and volume) obtained from the automated segmentation. To establish model reliability relative to manual segmentation, a subset of the imaging data was resegmented by the ground truth segmenter and by two additional segmenters (A, 6 months and B, 2 years of experience), with their performance evaluated relative to the ground truth. The final model scored well on anatomical performance metrics (Dice coefficient = 0.84, precision = 0.82, and sensitivity = 0.85). The median signal intensities and volumes of the automated segmentations were not significantly different from ground truth (0.3% difference, p = .9; 2.3% difference, p = .08, respectively). When the model results were compared with those of the independent segmenters, the model predictions demonstrated greater median Dice coefficient (A = 0.73, p = .001; B = 0.77, p = NS) and sensitivity (A = 0.68, p = .001; B = 0.72, p = .003). The model performed as well as retest segmentation by the ground truth segmenter on all measures. The quantitative measures extracted from the automated segmentation model did not differ from those of manual segmentation, enabling their use in quantitative magnetic resonance imaging pipelines to evaluate the anterior cruciate ligament.
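The anatomical similarity metrics (Dice coefficient, precision, sensitivity) and the quantitative measures (signal intensity, volume) reported above can be computed directly from binary segmentation masks. The sketch below is illustrative only, assuming 3-D NumPy arrays for the predicted mask, ground-truth mask, and source image; the function names and helper structure are hypothetical and not taken from the paper.

```python
import numpy as np

def segmentation_metrics(pred_mask, gt_mask):
    """Anatomical-similarity metrics between a predicted and a ground-truth
    binary mask (hypothetical helper; not the authors' evaluation code)."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    tp = np.logical_and(pred, gt).sum()      # voxels labeled ligament by both
    fp = np.logical_and(pred, ~gt).sum()     # predicted ligament, truly background
    fn = np.logical_and(~pred, gt).sum()     # missed ligament voxels
    dice = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    sensitivity = tp / (tp + fn) if (tp + fn) else 1.0
    return dice, precision, sensitivity

def ligament_measures(image, mask, voxel_volume_mm3):
    """Quantitative measures from a segmentation: median signal intensity of
    the voxels inside the mask and the total segmented volume."""
    inside = mask.astype(bool)
    median_signal = float(np.median(image[inside]))
    volume_mm3 = float(inside.sum() * voxel_volume_mm3)
    return median_signal, volume_mm3

# Example usage (assumed shapes): pred and gt are binary masks over a CISS volume.
# dice, prec, sens = segmentation_metrics(pred, gt)
# signal, volume = ligament_measures(ciss_volume, pred, voxel_volume_mm3=0.3)
```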
