Abstract

Accurate gestational age (GA) prediction is crucial for monitoring fetal development and ensuring optimal prenatal care. Traditional methods often face challenges in terms of precision and prediction efficiency, making modern deep learning (DL) techniques a promising solution. This paper introduces a novel DL approach for GA prediction from fetal brain images obtained via magnetic resonance imaging (MRI), combining the strength of the pretrained Xception model with a multihead attention (MHA) mechanism. The proposed model was trained on a diverse dataset of 52,900 fetal brain images from 741 patients, spanning GAs from 19 to 39 weeks. The pretrained Xception model served as the feature extraction component during training, and the extracted features were fed into differently configured MHA blocks, which produced GA predictions in days. The proposed model achieved promising results with 8 attention heads and key- and value-space dimensionalities of 32: an R-squared (R2) value of 96.5%, a mean absolute error (MAE) of 3.80 days, and a Pearson correlation coefficient (PCC) of 98.50% on the test set. Additionally, the 5-fold cross-validation results reinforce the model's reliability, with an average R2 of 95.94%, an MAE of 3.61 days, and a PCC of 98.02%. The proposed model performs well across anatomical views, notably the axial and sagittal views. A comparative analysis of multi-plane and single-plane inputs highlights the effectiveness of the proposed model against other state-of-the-art (SOTA) models reported in the literature. The proposed model could help clinicians accurately predict GA.
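To make the described architecture concrete, the sketch below outlines one plausible Keras implementation: a frozen, pretrained Xception backbone acting as the feature extractor, its spatial feature map reshaped into a token sequence, a configurable MultiHeadAttention block (8 heads, key and value dimensionalities of 32), and a single-unit regression head producing GA in days. This is a minimal illustration, not the authors' released code; the input size, frozen backbone, pooling strategy, and MAE training loss are assumptions.

```python
# Hedged sketch of the abstract's architecture: Xception features + multi-head
# attention regressing gestational age in days. Hyperparameters not stated in
# the abstract (input size, pooling, optimizer, loss) are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model


def build_ga_model(img_size=299, num_heads=8, key_dim=32, value_dim=32):
    inputs = layers.Input(shape=(img_size, img_size, 3))

    # Pretrained Xception backbone used purely as a feature extractor.
    backbone = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_tensor=inputs
    )
    backbone.trainable = False  # assumption: backbone kept frozen

    # Flatten the spatial grid into a sequence of feature tokens for attention.
    channels = backbone.output.shape[-1]
    x = layers.Reshape((-1, channels))(backbone.output)

    # Configurable multi-head self-attention over the extracted feature tokens.
    x = layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=key_dim, value_dim=value_dim
    )(x, x)

    # Pool the attended tokens and regress gestational age as a single value (days).
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(1)(x)
    return Model(inputs, outputs)


model = build_ga_model()
model.compile(optimizer="adam", loss="mae")  # MAE mirrors the reported metric
```

In this sketch, varying num_heads, key_dim, and value_dim reproduces the "differently configured MHA blocks" compared in the paper, with the reported best configuration corresponding to 8 heads and dimensionalities of 32.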
