Abstract
Uterine electromyography (uEMG) measures the electrical activity of the uterus noninvasively and is a promising technique for detecting preterm birth. However, uterine contractions are irregular during pregnancy and may not occur during a standard 30-minute recording. Hence, this study analyzes noncontraction segments of uEMG signals for predicting premature birth. Three-channel recordings of 53 term and 47 preterm noncontraction segments are obtained from a publicly available database. The signals are preprocessed, and the contraction and noncontraction segments are extracted manually based on the annotations. The Hjorth features, namely activity, mobility, and complexity, are extracted from the signals. Classification algorithms, namely support vector machine, random forest, and adaptive boosting, are designed to distinguish between term and preterm conditions. The results show that mobility decreases and complexity increases in preterm conditions. The support vector machine based on the proposed features of a single channel yields a maximum accuracy of 84.3% and an F1-score of 82.8% in differentiating term and preterm conditions. To improve the performance further, we adopted a decision fusion approach that combines predictions from multiple channels. The fused model improves the accuracy and F1-score by about 3%. Therefore, it appears that the proposed approach using noncontraction segments could serve as a biomarker for the reliable prediction of premature birth.
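The Hjorth parameters mentioned above have standard time-domain definitions: activity is the signal variance, mobility is the square root of the ratio of the variance of the first derivative to the variance of the signal, and complexity is the ratio of the mobility of the first derivative to the mobility of the signal. The sketch below illustrates these definitions on a synthetic signal; the sampling rate, segment length, and preprocessing shown are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def hjorth_features(x):
    """Compute Hjorth activity, mobility, and complexity of a 1-D signal.

    Standard definitions; derivatives are approximated by first
    differences. The paper's exact preprocessing is not reproduced here.
    """
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)        # first derivative (finite difference)
    ddx = np.diff(dx)      # second derivative
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

# Illustrative example: a pure sine is smooth and regular, while added
# noise introduces high-frequency content and raises mobility.
fs = 20.0                              # hypothetical sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * 0.5 * t)
noisy = clean + 0.2 * np.random.default_rng(0).normal(size=t.size)
print("clean:", hjorth_features(clean))
print("noisy:", hjorth_features(noisy))
```

In a classification pipeline such as the one described, these three values would be computed per channel and per segment and passed as the feature vector to the classifier.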