Abstract

With the stressful environment of day-to-day life, pressure in the corporate world, and challenges in educational institutes, more and more children and adults alike are affected by lifestyle diseases. Identification of a person's emotional state or stress level is an emerging research topic in the domain of Human-Machine Interfacing (HMI) as well as psychiatry. Speech has received increased attention as a modality from which reliable information on emotion can be automatically detected. Stress causes variation in the speech produced, which can be measured as negative emotion. If this negative emotion persists over a long period, it may harm the person physically or psychologically. This paper discusses the identification of stress by recognising a person's emotional state. Four approaches to automatic emotion recognition are implemented, and their performance in terms of accuracy and computation time is compared. The first approach is stress/emotion recognition based on Mel-Frequency Cepstral Coefficient (MFCC) features with a Lib-SVM classifier. In the other approaches, a Vector Quantization (VQ) based clustering technique is used for feature extraction. Three VQ-based algorithms are explored: (a) the Linde-Buzo-Gray (LBG) algorithm, (b) Kekre's Fast Codebook Generation (KFCG) algorithm, and (c) a modified KFCG algorithm. The results indicate that VQ-based features outperform MFCC features, with the modified KFCG algorithm giving the best results. The Surrey Audio-Visual Expressed Emotion (SAVEE) database of seven universal emotions and the ENTERFACE database of six emotions are used to train and test the multiclass SVM.
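
As a rough illustration of the first approach, the sketch below extracts MFCC features from an utterance and feeds them to a multiclass SVM. This is only a minimal sketch under assumed tooling: the paper uses Lib-SVM, whereas here scikit-learn's SVC (a libsvm wrapper) and librosa stand in, and all file paths, labels, and parameters are hypothetical placeholders rather than the paper's setup.

```python
# Minimal sketch: MFCC features + multiclass SVM (assumed librosa/scikit-learn stack).
import numpy as np
import librosa
from sklearn.svm import SVC

def mfcc_features(wav_path, n_mfcc=13):
    """Load an utterance and return a fixed-length MFCC feature vector
    (mean of each coefficient across frames)."""
    signal, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical training data: utterance paths and emotion labels
# (e.g. drawn from the seven SAVEE emotion classes).
train_paths = ["angry_01.wav", "happy_01.wav"]   # placeholder paths
train_labels = ["anger", "happiness"]            # placeholder labels

X_train = np.vstack([mfcc_features(p) for p in train_paths])
clf = SVC(kernel="rbf")   # multiclass SVM (SVC wraps libsvm)
clf.fit(X_train, train_labels)

# Predict the emotional state of a new utterance.
print(clf.predict(mfcc_features("test_utterance.wav").reshape(1, -1)))
```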

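For the VQ-based approaches, a codebook is built from the per-frame feature vectors of an utterance. The sketch below follows the standard formulation of the LBG algorithm (start from the global centroid, split each codevector by a small perturbation, then refine by nearest-neighbour assignment); the codebook size, perturbation factor, and iteration count are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of LBG codebook generation (standard split-and-refine form).
import numpy as np

def lbg_codebook(vectors, codebook_size=8, eps=0.01, n_iter=20):
    """Return a (codebook_size, dim) array of codevectors for the given
    training vectors (e.g. per-frame MFCCs of one utterance)."""
    codebook = vectors.mean(axis=0, keepdims=True)          # global centroid
    while codebook.shape[0] < codebook_size:
        # Split every codevector into two perturbed copies.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(n_iter):
            # Assign each vector to its nearest codevector (Euclidean distance).
            dists = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
            nearest = dists.argmin(axis=1)
            # Move each codevector to the centroid of its cluster.
            for k in range(codebook.shape[0]):
                members = vectors[nearest == k]
                if len(members) > 0:
                    codebook[k] = members.mean(axis=0)
    return codebook
```

The flattened codebook can then serve as a fixed-length feature vector passed to the multiclass SVM in place of the averaged MFCCs.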