Abstract
To improve the welfare of newborns, this study investigates sound-recognition-based artificial intelligence (AI) approaches for interpreting and monitoring infant cries. Crying is the primary means of communication between infants and caregivers, yet interpreting it has long been challenging. The limitations of conventional interpretation techniques are discussed, including the subjectivity of human judgment and the difficulty of detecting subtle variations in crying patterns. The research aims to classify crying patterns from the cries of male and female infants and to identify sounds indicative of distress. The study used the Mel Frequency Cepstral Coefficients (MFCC) method to extract features from internet-sourced MP3 and WAV audio data. This technique captured the distinctive qualities of each crying sound, which were then classified with several machine-learning models, including Random Forest and XGBoost. These two models outperformed the others, with accuracy rates of 94.5% and 94.2%, respectively, demonstrating how effectively such algorithms categorize different newborn cries. The findings lay the groundwork for potential Internet of Things (IoT) and healthcare framework implementations that support parents in caring for their newborns by offering insight into the distinctive vocalizations associated with crying.
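The abstract describes a pipeline of MFCC feature extraction followed by tree-ensemble classification. The sketch below is not the authors' code; it is a minimal illustration of that pipeline using librosa and scikit-learn, with hypothetical file paths and labels standing in for the internet-sourced cry recordings.

```python
# Minimal sketch (not the paper's implementation): MFCC features from
# WAV/MP3 cry recordings, classified with a Random Forest.
# File paths and labels below are hypothetical placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_mfcc(path, n_mfcc=13):
    """Load an audio file and return its mean MFCC vector over time."""
    signal, sr = librosa.load(path, sr=None)              # keep native sample rate
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)                              # one fixed-length vector per clip

# Hypothetical dataset: (file path, label) pairs such as "male_cry" / "female_cry".
dataset = [
    ("cries/male_001.wav", "male_cry"),
    ("cries/female_001.wav", "female_cry"),
    # ... more recordings
]

X = np.array([extract_mfcc(path) for path, _ in dataset])
y = np.array([label for _, label in dataset])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

An XGBoost classifier could be substituted for the Random Forest in the same pipeline; only the estimator construction changes.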