Abstract

Introduction: Diastasis recti abdominis (DRA) is a common condition in postpartum women. Measuring the distance between the separated rectus abdominis (RA) muscles in ultrasound images is a reliable method for diagnosing this condition. In clinical practice, the RA distance in multiple ultrasound images of a patient is measured by experienced sonographers, which is time-consuming, labor-intensive, and highly dependent on the operator's experience. An objective and fully automatic technique is therefore highly desirable to improve DRA diagnostic efficiency. This study aimed to evaluate deep learning-based methods for RA segmentation and distance measurement in ultrasound images.

Methods: A total of 675 RA ultrasound images were collected from 94 postpartum women and split into training (448 images), validation (86 images), and test (141 images) sets. Three segmentation models, U-Net, UNet++, and Res-UNet, were evaluated on RA segmentation and distance measurement.

Results: The Res-UNet model outperformed the other two, with the highest Dice score (85.93% ± 0.26%), the highest mean intersection over union (MIoU, 76.00% ± 0.39%), and the lowest Hausdorff distance (21.80 ± 0.76 mm). The average difference between the RA distance measured from the Res-UNet segmentation masks and that measured by experienced sonographers was only 3.44 ± 0.16 mm. The two measurements were also highly correlated (r = 0.944), with no systematic difference.

Conclusion: The Res-UNet model is reliable for RA segmentation and distance measurement in ultrasound images and has great potential for the clinical diagnosis of DRA.
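The abstract names the evaluation metrics (Dice, MIoU, Hausdorff distance) and a mask-based distance measurement, but does not spell out how they are computed. The Python sketch below shows one plausible way to derive these quantities from binary segmentation masks; the function names, the two-largest-component heuristic for locating the left and right RA, and the spacing_mm pixel size are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import label
from scipy.spatial import cKDTree
from scipy.spatial.distance import directed_hausdorff


def dice_and_iou(pred, gt, eps=1e-8):
    """Dice score and IoU between two binary masks (H, W arrays of 0/1)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum() + eps)
    iou = inter / (np.logical_or(pred, gt).sum() + eps)
    return dice, iou


def hausdorff_mm(pred, gt, spacing_mm=0.1):
    """Symmetric Hausdorff distance between foreground pixels, in mm.

    spacing_mm is a placeholder isotropic pixel size; a real pipeline
    would read the physical spacing from the ultrasound image metadata.
    """
    p, g = np.argwhere(pred), np.argwhere(gt)
    d = max(directed_hausdorff(p, g)[0], directed_hausdorff(g, p)[0])
    return d * spacing_mm


def inter_rectus_distance_mm(mask, spacing_mm=0.1):
    """Estimate the inter-RA distance as the minimum gap between the
    two largest connected components of the mask (one per muscle).
    This heuristic is an assumption for illustration only."""
    labeled, n = label(mask)
    if n < 2:
        return 0.0  # muscles not separated, or segmentation failed
    sizes = np.bincount(labeled.ravel())[1:]   # pixel count per component
    a, b = np.argsort(sizes)[-2:] + 1          # labels of the two largest
    pa = np.argwhere(labeled == a)
    pb = np.argwhere(labeled == b)
    # nearest-neighbour search between the two muscle bellies
    gap_px = cKDTree(pa).query(pb)[0].min()
    return gap_px * spacing_mm
```

Under these assumptions, the reported comparison would amount to running inter_rectus_distance_mm on each predicted mask and correlating the results against the sonographers' manual measurements.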
