Abstract

Segmentation of anatomical structures from ultrasound images normally requires an experienced clinician, and automating the process is complicated by the characteristic artifacts of ultrasound imaging. In this article, we present a novel end-to-end network that automatically measures fetal head circumference (HC) and fetal abdominal circumference (AC) from 2-dimensional (2D) ultrasound images in every pregnancy trimester. These measurements matter because HC and AC are used to estimate gestational age and to monitor fetal growth. Automated HC and AC assessment provides independent, objective results and is particularly useful in developing countries, where trained sonographers are in short supply. We propose a scale attention expanding network that builds a feature pyramid inside the network; the intermediate result at each scale is concatenated with the features of the next layer through a fusion scheme. A scale attention module is further proposed to select the most useful scale and to reduce scale noise. To optimize the network, a deep supervision method based on boundary attention is employed. Experimental results show that the scale attention expanding network achieved an absolute difference, Hausdorff distance, and Dice similarity coefficient of 1.81 ± 1.69%, 1.22 ± 0.77%, and 97.94%, respectively, which were among the top results on the HC18 data set; the corresponding results on the abdomen data set were 2.23 ± 2.38%, 0.42 ± 0.56%, and 98.04%. These experiments demonstrate that our method outperforms existing fetal ultrasound segmentation methods.
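The abstract does not give implementation details, but the scale attention idea, learning a per-scale weighting so that the most informative pyramid level dominates the fusion while noisy scales are suppressed, can be sketched in a few lines of PyTorch. The module below is a minimal illustrative sketch under that reading; the class name `ScaleAttention`, the squeeze-and-score layer choices, and all parameters are assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleAttention(nn.Module):
    """Hypothetical scale attention: weights a stack of multi-scale
    feature maps so the most informative scale dominates the fusion.
    (Illustrative sketch, not the paper's released code.)"""

    def __init__(self, channels: int, num_scales: int):
        super().__init__()
        # Squeeze each scale to a global descriptor, then score all
        # scales jointly to produce one weight per scale.
        self.score = nn.Sequential(
            nn.Linear(channels * num_scales, channels),
            nn.ReLU(inplace=True),
            nn.Linear(channels, num_scales),
        )

    def forward(self, feats: list[torch.Tensor]) -> torch.Tensor:
        # Upsample every scale to the resolution of the finest map.
        target = feats[0].shape[-2:]
        feats = [F.interpolate(f, size=target, mode="bilinear",
                               align_corners=False) for f in feats]
        stacked = torch.stack(feats, dim=1)           # (B, S, C, H, W)
        # Global average pool each scale -> (B, S*C) descriptor.
        desc = stacked.mean(dim=(-2, -1)).flatten(1)
        # Softmax over scales down-weights noisy scales.
        w = self.score(desc).softmax(dim=1)           # (B, S)
        w = w.view(w.size(0), w.size(1), 1, 1, 1)
        return (stacked * w).sum(dim=1)               # fused (B, C, H, W)
```

For example, fusing four pyramid levels of 64 channels each would look like `fuse = ScaleAttention(channels=64, num_scales=4)` followed by `out = fuse([p0, p1, p2, p3])`; the softmax across scales is what lets the network "select the most useful scale" as described above.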
