This study aims to develop a multimodal deep learning-based algorithm for detecting specific fetal heart rate (FHR) events, to enhance automatic monitoring and intelligent assessment of fetal well-being. We analyzed FHR and uterine contraction signals by combining various feature extraction techniques, including morphological features, heart rate variability features, and nonlinear domain features, with deep learning algorithms. This approach enabled us to classify four specific FHR events (bradycardia, tachycardia, acceleration, and deceleration) as well as four distinct deceleration patterns (early, late, variable, and prolonged deceleration). We proposed a multi-model deep neural network and a pre-fusion deep learning model to accurately classify the multimodal parameters derived from cardiotocography (CTG) signals. Accuracy was evaluated against expert-labeled data. The algorithm achieved a classification accuracy of 96.2% for acceleration, 94.4% for deceleration, 90.9% for tachycardia, and 85.8% for bradycardia. Additionally, it achieved 67.0% accuracy in classifying the four distinct deceleration patterns, with 80.9% accuracy for late deceleration and 98.9% for prolonged deceleration. The proposed multimodal deep learning algorithm serves as a reliable decision support tool for clinicians, significantly improving the detection and assessment of specific FHR events, which are crucial for fetal health monitoring.
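To make the pre-fusion idea concrete, the sketch below (PyTorch) shows one way such a classifier could be structured: a 1-D convolutional encoder embeds the raw FHR and uterine contraction window, the embedding is concatenated with handcrafted morphological, heart rate variability, and nonlinear features before a shared classification head, and the head outputs logits for the four FHR events. All names, window lengths, feature counts, and layer choices here are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch of a pre-fusion multimodal FHR-event classifier.
# Hypothetical sizes: 2-channel (FHR + uterine contraction) windows of
# 1200 samples, 24 handcrafted features, 4 event classes.
import torch
import torch.nn as nn

class PreFusionFHRClassifier(nn.Module):
    def __init__(self, n_handcrafted=24, n_classes=4):
        super().__init__()
        # Encode the raw 2-channel signal window into a fixed-size embedding.
        self.encoder = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # -> (batch, 32, 1)
        )
        # Classify the fused (embedding + handcrafted feature) vector.
        self.head = nn.Sequential(
            nn.Linear(32 + n_handcrafted, 64), nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(64, n_classes),
        )

    def forward(self, signal, handcrafted):
        z = self.encoder(signal).squeeze(-1)          # (batch, 32)
        fused = torch.cat([z, handcrafted], dim=-1)   # pre-fusion by concatenation
        return self.head(fused)                       # event-class logits

# Example forward pass on random data (batch of 8 windows).
model = PreFusionFHRClassifier()
logits = model(torch.randn(8, 2, 1200), torch.randn(8, 24))
print(logits.shape)  # torch.Size([8, 4])
```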