Abstract
The aim of this article is to classify children's affective states in a real-life, non-prototypical emotion recognition scenario. The framework is the same as that proposed in the Interspeech 2009 Emotion Challenge. We used a large set of acoustic features and five linguistic parameters based on the concept of emotional salience. Features were extracted from the spontaneous speech recordings of the FAU Aibo Corpus and their transcriptions. We used a wrapper method to reduce the acoustic feature set from 384 to 28 elements and feature-level fusion to merge them with the set of linguistic parameters. We studied three classification approaches: a Naïve-Bayes classifier, a support vector machine, and a logistic model tree. Results show that the linguistic features improve the performance of classifiers that use only acoustic features. Additionally, merging the linguistic features with the reduced acoustic set is more effective than working with the full dataset. The best classifier performance is achieved with the logistic model tree and the reduced set of acoustic and linguistic features, which improves the performance obtained with the full dataset by 4.15% absolute (10.14% relative) and improves the performance of the Naïve-Bayes classifier by 9.91% absolute (28.18% relative). Under the same conditions proposed in the Emotion Challenge, this simple scheme slightly outperforms a much more complex structure involving seven classifiers and a larger number of features.
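The pipeline the abstract describes (wrapper-based feature selection, then feature-level fusion of acoustic and linguistic features, then classification) can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the paper's implementation: the dimensions are scaled down for speed, and since scikit-learn has no logistic model tree, `LogisticRegression` stands in for that third classifier.

```python
# Hedged sketch of the abstract's pipeline on synthetic data.
# The paper reduces 384 acoustic features to 28; here 40 -> 8 for speed.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_acoustic, n_linguistic = 200, 40, 5   # illustrative sizes
X_acoustic = rng.normal(size=(n_samples, n_acoustic))
X_linguistic = rng.normal(size=(n_samples, n_linguistic))
y = rng.integers(0, 5, size=n_samples)             # five affective classes

# Wrapper method: greedily keep the acoustic subset that best serves a
# classifier evaluated by cross-validation.
selector = SequentialFeatureSelector(
    GaussianNB(), n_features_to_select=8, direction="forward", cv=3
)
X_reduced = selector.fit_transform(X_acoustic, y)

# Feature-level fusion: concatenate the reduced acoustic set with the
# linguistic parameters before classification.
X_fused = np.hstack([X_reduced, X_linguistic])

# The paper compares a Naive-Bayes classifier, an SVM, and a logistic
# model tree; LogisticRegression is a stand-in for the latter here.
for clf in (GaussianNB(), SVC(), LogisticRegression(max_iter=1000)):
    clf.fit(X_fused, y)

print(X_fused.shape)  # (200, 13): 8 selected acoustic + 5 linguistic
```

The fusion step is deliberately simple: because the merge happens at the feature level (one concatenated vector per utterance), any downstream classifier can consume the combined representation unchanged.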