MP42-05 AUTOMATIC SEGMENTATION AND 3D VISUALIZATION OF PELVIC MESH USING MATHEMATICAL MODELLING AND MACHINE LEARNING TECHNIQUES IN MRI

Gaik Ambartsoumian*, Souvik Roy, Gaurav Khatri, and Philippe E. Zimmern

Journal of Urology, Volume 203, Issue Supplement 4 (April 2020), Page e605. Imaging/Radiology: Uroradiology II (MP42), 1 Apr 2020. https://doi.org/10.1097/JU.0000000000000891.05

Abstract

INTRODUCTION AND OBJECTIVE: Pre-operative imaging to localize and measure previously placed synthetic pelvic implants, such as mid-urethral slings and/or trans-vaginal meshes, is an important task in modern pelvic floor reconstruction [1]. Our aim is therefore to develop new mathematical models and machine learning techniques for automatic detection and visualization of pelvic mesh in MRI.

METHODS: Some of the most efficient modern methods of automatic image segmentation rely on machine learning techniques that require massive training data. A realistic yet efficient option for acquiring large training data sets is the generation of synthetic data using mathematical models optimized for the specific task at hand [2]. Drawing on the expertise of our group of applied mathematicians, radiologists, and pelvic floor specialists, we de-identified human MR images from 10 patients (several hundred 2D images per patient), collected by the medical team and transferred to the mathematicians in DICOM format under an inter-institutional Material Transfer Agreement. Images from 5 women were manually segmented and labeled by the math team, verified by the radiologist, and are currently used as a source of data augmentation (i.e., for generating additional synthetic training data). This augmented data set was used to train a Convolutional Neural Network (CNN) to analyze our MR images. The MR images from the remaining 5 patients were then used as validation and test data sets.

RESULTS: Based on 2D image segmentation and labeling, we have established a framework for enhanced 3D stereo-metric visualization of MR images (figure), which is more desirable for physicians than the slice-by-slice image sequences provided by conventional DICOM viewers. Using machine learning techniques, we are currently developing a new mathematical apparatus for automatic segmentation of MR images of pelvic floor structures, including pelvic implants.

CONCLUSIONS: This interdisciplinary project on mathematical approaches to automatic segmentation and 3D visualization of MRI of pelvic floor structures has yielded very encouraging results. Comparison with intra-operative findings during mesh removal procedures will be needed to validate these results.

Source of Funding: None

© 2020 by American Urological Association Education and Research, Inc.
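The METHODS section describes generating synthetic training slices for data augmentation and training a CNN on them. The sketch below is a minimal illustration of that general workflow, not the authors' implementation: the toy network architecture, the slice size, the synthetic "mesh-like" mask generator, and the PyTorch training loop are all assumptions introduced here for illustration only.

```python
# Minimal sketch of the augmentation-plus-CNN workflow described in METHODS.
# NOT the authors' code: the toy architecture, image size, and the synthetic
# "mesh-like" mask generator below are illustrative assumptions.
import torch
import torch.nn as nn

def synthetic_slice(size=128):
    """Generate one synthetic 2D 'MR slice' and a binary mesh mask.

    Stand-in for model-based synthetic data generation [2]: a smooth
    background plus a thin, bright curvilinear structure playing the
    role of a sling/mesh cross-section.
    """
    x = torch.linspace(-1, 1, size)
    yy, xx = torch.meshgrid(x, x, indexing="ij")
    background = 0.4 * torch.exp(-(xx**2 + yy**2))            # soft-tissue-like blob
    curve = torch.abs(yy - 0.3 * torch.sin(3 * xx)) < 0.02    # thin curvilinear "mesh"
    image = background + 0.6 * curve.float() + 0.05 * torch.randn(size, size)
    return image.unsqueeze(0), curve.float().unsqueeze(0)     # (1, H, W) each

class TinySegNet(nn.Module):
    """Very small fully convolutional network producing per-pixel mesh logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),                               # logits, same H x W
        )
    def forward(self, x):
        return self.net(x)

model = TinySegNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):                                        # toy training loop
    imgs, masks = zip(*(synthetic_slice() for _ in range(8)))  # synthetic mini-batch
    imgs, masks = torch.stack(imgs), torch.stack(masks)
    loss = loss_fn(model(imgs), masks)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In practice the manually segmented slices from the 5 training patients would be mixed with such synthetic samples, and the held-out 5 patients would be kept for validation and testing, as stated in the abstract.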
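The RESULTS section mentions a framework for 3D stereo-metric visualization built from 2D segmentations. One common way to realize that step is to stack per-slice binary masks into a volume and extract a surface mesh; the sketch below uses scikit-image's marching cubes for this. The choice of marching cubes, the function names, and the spacing values are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of a 3D visualization step: per-slice 2D segmentation masks
# are stacked into a volume and converted to a surface mesh. Spacing values
# below are placeholders, not patient-specific DICOM metadata.
import numpy as np
from skimage import measure

def masks_to_surface(mask_slices, slice_thickness_mm=3.0, pixel_spacing_mm=0.8):
    """Stack binary 2D masks (list of H x W arrays) and extract an isosurface."""
    volume = np.stack(mask_slices, axis=0).astype(np.float32)   # (Z, H, W)
    verts, faces, normals, _ = measure.marching_cubes(
        volume,
        level=0.5,
        spacing=(slice_thickness_mm, pixel_spacing_mm, pixel_spacing_mm),
    )
    return verts, faces, normals

# The resulting verts/faces can be written to an STL/OBJ file or passed to any
# 3D viewer, giving a stereo-metric view of the segmented implant instead of
# slice-by-slice browsing in a conventional DICOM viewer.
```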
