Abstract
Volumetric placental measurement using 3-D ultrasound has proven clinical utility in predicting adverse pregnancy outcomes. However, this metric cannot currently be employed as part of a screening test due to a lack of robust and real-time segmentation tools. We present a multiclass (MC) convolutional neural network (CNN) developed to segment the placenta, amniotic fluid, and fetus. The ground-truth data set consisted of 2093 labeled placental volumes augmented by 300 volumes with the placenta, amniotic fluid, and fetus annotated. A two-pathway, hybrid (HB) model using transfer learning, a modified loss function, and exponential average weighting was developed and demonstrated the best performance for placental segmentation (PS), achieving a Dice similarity coefficient (DSC) of 0.84 and an average Hausdorff distance (HDAV) of 0.38 mm. The dual-pathway architecture improved PS by 0.03 DSC and reduced HDAV by 0.27 mm compared with a naïve MC model. Incorporating exponential average weighting produced a further small improvement of 0.01 in DSC and a reduction of 0.44 mm in HDAV. Per-volume inference using the fully convolutional neural network (FCNN) took 7-8 s. This method should enable clinically relevant morphometric measurements (such as volume and total surface area) to be generated automatically for the placenta, amniotic fluid, and fetus. The ready availability of such metrics makes a population-based screening test for adverse pregnancy outcomes possible.
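For readers unfamiliar with the reported metric and the weight-averaging technique, the following is a minimal sketch, not the authors' implementation, of the Dice similarity coefficient and an exponential average of network weights as referred to in the abstract. It assumes binary 3-D NumPy masks and a flat list of weight arrays; the function names and the decay constant are illustrative assumptions.

    import numpy as np

    def dice_coefficient(pred, truth):
        # Dice similarity coefficient: 2*|A intersect B| / (|A| + |B|).
        pred = pred.astype(bool)
        truth = truth.astype(bool)
        intersection = np.logical_and(pred, truth).sum()
        denom = pred.sum() + truth.sum()
        # Convention: two empty masks count as a perfect match.
        return 2.0 * intersection / denom if denom else 1.0

    def ema_update(avg_weights, new_weights, decay=0.999):
        # Exponential average of model weights after a training step;
        # the decay value is an assumption, not taken from the paper.
        return [decay * a + (1.0 - decay) * w
                for a, w in zip(avg_weights, new_weights)]

    # Toy usage: two partially overlapping 3-D masks.
    pred = np.zeros((64, 64, 64), dtype=bool); pred[10:30] = True
    truth = np.zeros_like(pred); truth[15:35] = True
    print(dice_coefficient(pred, truth))  # value in [0, 1]

Averaging weights exponentially over training steps, rather than evaluating the final-step weights alone, is a common way to stabilize segmentation performance; the abstract reports it yielding a small DSC gain and a 0.44-mm HDAV reduction here.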