Abstract
The study of infant cry recognition aims to identify what an infant needs from her cry. Different crying sounds can give caregivers clues about how to respond to the infant's needs, and appropriate responses to an infant's cries may influence the infant's emotional, behavioral, and relational development while growing up. From a pattern recognition perspective, recognizing particular needs or emotions from an infant's cry is much more difficult than recognizing emotions from an adult's speech, because an infant's cry usually contains no verbal information. In this paper, we study the problem of classifying five types of emotions or needs expressed by infant cries, namely hunger, sleepiness, discomfort, stomachache, and the need to burp. We propose a novel approach combining Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) that acts as feature extractor and classifier at once. In particular, the CNN learns salient features from raw spectrogram information, and the RNN learns temporal information from the CNN-extracted features. We apply 5-fold cross-validation on a training set of 200 samples and a validation set of 50 samples; the model with the best weights is then tested on a test set of 65 samples. Evaluation on the Dunstan Baby Language dataset shows that our CNN-RNN model outperforms the previous method, with an average classification accuracy of up to 94.97%. This encouraging result demonstrates that combining a CNN-RNN architecture with 5-fold cross-validation yields accurate and robust performance.
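The abstract describes a pipeline in which a CNN extracts features from a spectrogram and an RNN summarizes those features over time before a 5-way classification. The paper's exact layer sizes and configuration are not given here, so the following is only a minimal NumPy sketch of one forward pass under assumed toy dimensions (40 mel bins, 100 frames, 8 convolutional filters, a 16-unit vanilla RNN, 5 output classes), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_over_time(spec, kernels):
    """Convolve full-height kernels along the time axis, then apply ReLU.

    spec: (freq_bins, frames); kernels: (n_filters, freq_bins, width).
    Returns (n_filters, frames - width + 1) feature maps.
    """
    n_filters, _, width = kernels.shape
    n_out = spec.shape[1] - width + 1
    out = np.empty((n_filters, n_out))
    for f in range(n_filters):
        for t in range(n_out):
            out[f, t] = np.sum(spec[:, t:t + width] * kernels[f])
    return np.maximum(out, 0.0)

def rnn_last_state(features, Wx, Wh):
    """Run a vanilla tanh RNN over the time axis; return the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for t in range(features.shape[1]):
        h = np.tanh(Wx @ features[:, t] + Wh @ h)
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy dimensions (assumptions for illustration, not the paper's values).
n_mels, n_frames, n_filters, hidden, n_classes = 40, 100, 8, 16, 5

spec = rng.standard_normal((n_mels, n_frames))            # stand-in spectrogram
kernels = rng.standard_normal((n_filters, n_mels, 5)) * 0.1
Wx = rng.standard_normal((hidden, n_filters)) * 0.1       # input-to-hidden
Wh = rng.standard_normal((hidden, hidden)) * 0.1          # hidden-to-hidden
Wo = rng.standard_normal((n_classes, hidden)) * 0.1       # hidden-to-output

feats = conv_over_time(spec, kernels)   # CNN stage: salient local features
h = rnn_last_state(feats, Wx, Wh)       # RNN stage: temporal summary
probs = softmax(Wo @ h)                 # distribution over the 5 cry classes
print(probs.shape, float(probs.sum()))
```

In a trained model the kernels and weight matrices would be learned jointly by backpropagation; here random weights merely show how the shapes flow from spectrogram to class probabilities.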