Abstract

Magnetic resonance imaging (MRI) plays an important role in assessing pelvic organ prolapse (POP), and automated localization of pelvic floor landmarks could accelerate MRI-based POP measurements. Herein, we aimed to develop and evaluate a deep learning-based technique for automated localization of POP-related landmarks. Ninety-six mid-sagittal stress MR images (at rest and at maximal Valsalva) were used for model training and generalization testing; the dataset was randomly split into a training set of 73 images and a testing set of 23 images. One soft-tissue landmark (the cervical os [P1]) and three bony landmarks (the mid-pubic line [MPL] endpoints [P2 & P3] and the sacrococcygeal inferior-pubic point [SCIPP] line endpoints [P3 & P4]) were annotated by experts. We used an encoder-decoder structure to develop the deep learning model for automated localization of the four landmarks. Localization performance was assessed using the root square error (RSE), and the reference lines were assessed by their length and orientation differences. The landmarks P1 to P4 were localized with mean RSEs of 1.9 mm, 1.3 mm, 0.9 mm, and 3.6 mm, respectively. The mean length errors of the MPL and the SCIPP line were 0.1 mm and -2.1 mm, and the corresponding mean orientation errors were -0.7° and -0.3°. Our method processed each image in 0.015 s. We demonstrated the feasibility of a deep learning-based approach for accurate, fast, and fully automated localization of bony and soft-tissue landmarks, which could speed up MRI interpretation for fast POP screening and treatment planning.
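The abstract describes an encoder-decoder network for localizing the four landmarks but does not detail the architecture. The sketch below is a minimal, illustrative encoder-decoder that regresses one heatmap per landmark (a common formulation for landmark localization); the layer widths, heatmap approach, and input size are assumptions, not the study's exact design.

```python
import torch
import torch.nn as nn

class LandmarkNet(nn.Module):
    """Illustrative encoder-decoder: downsample the MR slice, then
    upsample back to full resolution with one output channel per
    landmark. Channel widths are arbitrary assumptions."""

    def __init__(self, n_landmarks: int = 4):
        super().__init__()
        # Encoder: two stride-2 convolutions halve resolution twice.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: two transposed convolutions restore the resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, n_landmarks, kernel_size=2, stride=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

net = LandmarkNet()
# One grayscale 64x64 slice (batch, channel, height, width).
heatmaps = net(torch.randn(1, 1, 64, 64))
# heatmaps has shape (1, 4, 64, 64); the argmax of each of the four
# maps would give one predicted landmark position.
```

In a heatmap formulation like this, training targets are typically Gaussian blobs centered on the annotated landmark coordinates, and the predicted coordinate is read off as the location of each output map's maximum.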
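The evaluation metrics named in the abstract can be sketched as follows: the RSE of a landmark is the Euclidean distance between its predicted and annotated positions, and a reference line's length and orientation errors compare the line drawn between predicted endpoints against the line drawn between annotated endpoints. All coordinates below are illustrative values, not data from the study.

```python
import math

def rse(pred, true):
    """Root square error: Euclidean distance (mm) between a predicted
    and a ground-truth landmark, each given as (x, y) in mm."""
    return math.hypot(pred[0] - true[0], pred[1] - true[1])

def line_length(p, q):
    """Length of the reference line joining endpoints p and q."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def line_orientation_deg(p, q):
    """Orientation of the line p -> q in degrees from the x-axis."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

# Illustrative MPL endpoints (mm): P2 and P3, predicted vs. annotated.
p2_pred, p3_pred = (10.0, 20.0), (40.0, 60.0)
p2_true, p3_true = (11.0, 19.0), (40.5, 60.5)

length_err = line_length(p2_pred, p3_pred) - line_length(p2_true, p3_true)
orient_err = (line_orientation_deg(p2_pred, p3_pred)
              - line_orientation_deg(p2_true, p3_true))
```

Signed length and orientation differences (prediction minus ground truth), as computed here, preserve the direction of the bias, which is why the abstract can report negative mean errors such as -2.1 mm for the SCIPP line.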
