Abstract

Journal of Urology, CME, 1 Apr 2023

MP09-05: AUTOMATED PROSTATE GLAND AND PROSTATE ZONES SEGMENTATION USING A NOVEL MRI-BASED MACHINE LEARNING FRAMEWORK AND CREATION OF SOFTWARE INTERFACE FOR USERS ANNOTATION

Masatomo Kaneko, Giovanni E. Cacciamani, Yijing Yang, Vasileios Magoulianitis, Jintang Xue, Jiaxin Yang, Jinyuan Liu, Maria Sarah L. Lenon, Passant Mohamed, Darryl H. Hwang, Karan Gill, Manju Aron, Vinay Duddalwar, Suzanne L. Palmer, C.-C. Jay Kuo, Andre Luis Abreu, Inderbir Gill, and Chrysostomos L. Nikias
https://doi.org/10.1097/JU.0000000000003224.05

INTRODUCTION AND OBJECTIVE: To develop an automated machine learning (ML) model that segments the prostate gland, the peripheral zone (PZ), and the transition zone (TZ) on magnetic resonance imaging (MRI), and to create a web-based software interface for annotation.

METHODS: Consecutive men who underwent prostate MRI followed by prostate biopsy (PBx) were identified from our PBx database (IRB# HS-13-00663). 3T MRI was performed according to the Prostate Imaging-Reporting and Data System (PI-RADS) v2 or v2.1. The T2-weighted (T2W) images were manually segmented into the whole prostate, PZ, and TZ by an experienced radiologist and an experienced urologist. A two-stage automated segmentation model based on Green Learning (GL), a novel non-deep-learning method, was designed: the first stage segments the prostate gland, and the second stage zooms into the prostate area to delineate the TZ and PZ. Both stages share a lightweight feed-forward encoder-decoder GL system. Included accessions were split for 5-fold cross-validation. Volumes were calculated from the number of segmented pixels/voxels. Model performance for automated prostate segmentation was evaluated with Dice scores and Pearson correlation coefficients. A web-based software interface was designed and implemented so that users can interact with the AI annotation model and make necessary adjustments.

RESULTS: A total of 119 patients (19,992 T2W images) met the inclusion criteria (Figure 1). Using the training dataset of 95 MRIs, an ML model for whole-prostate, PZ, and TZ segmentation was constructed. The mean Dice scores for the whole prostate, PZ, and TZ were 0.85, 0.62, and 0.81, respectively.
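As an illustration of the evaluation metric reported above (not the authors' code), the Dice score between a predicted and a reference binary mask can be computed as follows; the function name and the toy masks are assumptions for this sketch:

```python
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy example: two overlapping 4x4 masks
a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True  # 4 pixels
b = np.zeros((4, 4), dtype=bool); b[1:3, 1:4] = True  # 6 pixels
print(round(dice_score(a, b), 2))  # overlap = 4, so 2*4/(4+6) = 0.8
```

A Dice score of 1.0 indicates identical masks; the 0.85 whole-prostate result above therefore reflects high overlap between the automated and manual contours.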
The Pearson correlation coefficients for whole-prostate, PZ, and TZ volumes were 0.92 (p<0.01), 0.62 (p<0.01), and 0.93 (p<0.01), respectively. The web-based software interface takes a mean of 90 seconds for prostate segmentation of a 168-slice study. The platform supports DICOM series upload, image preview, image modification, 3-dimensional preview, and annotation-mask export from any device, without migrating data.

CONCLUSIONS: A lightweight feed-forward encoder-decoder model based on Green Learning can precisely segment the whole prostate, PZ, and TZ, and is available through a user-friendly software interface.

Source of Funding: None

© 2023 by American Urological Association Education and Research, Inc. Journal of Urology, Volume 209, Issue Supplement 4, April 2023, Page e105.
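The volume comparison above rests on two steps the abstract describes: converting a voxel count to a physical volume, and correlating predicted against reference volumes. A minimal sketch of both, assuming millimetre voxel spacing taken from the image header (function names and sample values are hypothetical):

```python
import numpy as np

def mask_volume_ml(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary mask in millilitres.

    spacing_mm is the per-axis voxel spacing in mm (e.g. from the image
    header); volume = voxel count * single-voxel volume in mm^3, / 1000.
    """
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.astype(bool).sum()) * voxel_mm3 / 1000.0

def pearson_r(x, y) -> float:
    """Pearson correlation coefficient between two volume series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

# 1000 voxels at 0.5 x 0.5 x 3.0 mm spacing -> 750 mm^3 = 0.75 mL
vol = mask_volume_ml(np.ones((10, 10, 10)), (0.5, 0.5, 3.0))
print(vol)  # 0.75

# Hypothetical predicted vs. manual whole-prostate volumes (mL)
r = pearson_r([30.0, 45.0, 60.0, 80.0], [32.0, 44.0, 58.0, 85.0])
print(r > 0.9)  # strong agreement, as in the 0.92 reported above
```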
