Abstract

Accurate segmentation of the fetal head and pubic symphysis in intrapartum ultrasound images, together with measurement of the fetal angle of progression (AoP), is critical to both outcome prediction and complication prevention in delivery. However, owing to the poor quality of perinatal ultrasound imaging, with blurred target boundaries and the relatively small size of the pubic symphysis, fully automated and accurate segmentation remains challenging. In this paper, we propose a dual-path boundary-guided residual network (DBRN), a novel approach to tackling these challenges. The model contains a multi-scale weighted module (MWM) that gathers global context information and enhances the feature response within the target region by weighting the feature map. It also incorporates an enhanced boundary module (EBM) to obtain more precise boundary information. Furthermore, it introduces a boundary-guided dual-attention residual module (BDRM) for residual learning. BDRM leverages boundary information as prior knowledge and employs spatial attention to focus simultaneously on background and foreground information, capturing concealed details and improving segmentation accuracy. Extensive comparative experiments have been conducted on three datasets. The proposed method achieves an average Dice score of 0.908 ± 0.05 and an average Hausdorff distance of 3.396 ± 0.66 mm, outperforming state-of-the-art competitors. In addition, the average difference between AoP measurements produced automatically by the model and manual measurements is 6.157°, indicating good consistency and broad application prospects in clinical practice.
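The two evaluation metrics reported above are standard for binary segmentation. As a minimal illustrative sketch (not the authors' evaluation code, and ignoring the pixel-to-millimetre scaling a real evaluation would apply), they can be computed from binary masks as follows:

```python
import numpy as np

def dice_score(a, b):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between the foreground pixel
    sets of two binary masks, in pixel units (brute force)."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    # Pairwise Euclidean distances between all foreground pixels.
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    # Max over each set of the nearest-neighbour distance to the other.
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy example with two overlapping 4x4 masks.
a = np.zeros((4, 4)); a[0:2, :] = 1
b = np.zeros((4, 4)); b[1:3, :] = 1
print(dice_score(a, b))          # 0.5
print(hausdorff_distance(a, b))  # 1.0
```

In practice the Hausdorff distance would be converted to millimetres using the ultrasound image's pixel spacing, as in the 3.396 mm figure reported above.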

