Abstract
Automatic computerized segmentation of the fetal head from ultrasound images and head circumference (HC) biometric measurement remain challenging, owing to the inherent characteristics of fetal ultrasound images at different trimesters of pregnancy. In this paper, we propose a new deep learning method for automatic fetal ultrasound image segmentation and HC biometry: the deeply supervised attention-gated (DAG) V-Net, which incorporates an attention mechanism and a deep supervision strategy into the V-Net model. In addition, a multi-scale loss function is introduced for deep supervision. The training set of the HC18 Challenge was expanded with data augmentation to train the DAG V-Net deep learning models. The trained models were used to automatically segment the fetal head from two-dimensional ultrasound images, followed by morphological processing, edge detection, and ellipse fitting. The fitted ellipses were then used for HC biometric measurement. The proposed DAG V-Net method was evaluated on the testing set of HC18 (n = 355) in terms of four performance indices: Dice similarity coefficient (DSC), Hausdorff distance (HD), HC difference (DF), and HC absolute difference (ADF). Experimental results showed that DAG V-Net achieved a DSC of 97.93%, a DF of 0.09 ± 2.45 mm, an ADF of 1.77 ± 1.69 mm, and an HD of 1.29 ± 0.79 mm. The proposed DAG V-Net method ranked fifth among the participants in the HC18 Challenge. By incorporating the attention mechanism and deep supervision, the proposed method yielded better segmentation performance than conventional U-Net and V-Net methods. Compared with published state-of-the-art methods, the proposed DAG V-Net had better or comparable segmentation performance. DAG V-Net may thus serve as a new method for fetal ultrasound image segmentation and HC biometry. The code of DAG V-Net will be made available publicly at https://github.com/xiaojinmao-code/ .
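The evaluation pipeline described above can be illustrated with a minimal sketch. This is not the authors' released code; it assumes binary segmentation masks as NumPy arrays and computes DSC, the signed and absolute HC differences (DF/ADF), and an HC estimate from a fitted ellipse's semi-axes via Ramanujan's perimeter approximation (the function names are illustrative):

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return 2.0 * intersection / (pred.sum() + gt.sum())

def ellipse_circumference(a, b):
    """HC from a fitted ellipse with semi-axes a, b (in mm),
    using Ramanujan's approximation for the ellipse perimeter."""
    h = ((a - b) / (a + b)) ** 2
    return np.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + np.sqrt(4.0 - 3.0 * h)))

def hc_errors(hc_pred, hc_true):
    """Signed HC difference (DF) and absolute difference (ADF), in mm."""
    df = hc_pred - hc_true
    return df, abs(df)
```

For a circle (a = b = r) the approximation reduces exactly to 2πr, which is a convenient sanity check before applying it to fitted ellipses.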