Abstract

In this work we combine time-domain, spectral, and cepstral features of signals captured on a tablet to recognize depression, anxiety, and stress emotional states on the EMOTHAW database. EMOTHAW represents users' emotional states through signals captured by tablet and pen sensors while each user performs three handwriting and four drawing tasks; users are categorized as depressed, anxious, stressed, or typical according to the Depression, Anxiety and Stress Scale (DASS). Each user is characterized by six time-domain features, while the number of spectral and cepstral features computed for the horizontal and vertical pen displacement, the pressure on the paper, and the time the pen spends in the air and on the paper depends on the configuration of the filterbank. We then select the most relevant features using the Fast Correlation-Based Filter (FCBF) method. Because the dataset contains only 129 users, we augment the training data by randomly selecting a percentage of the training samples and adding small random Gaussian noise to their extracted features. We then train a radial basis function (RBF) SVM and evaluate it using the Leave-One-Out (LOO) methodology. The experimental results show an average classification accuracy improvement of 15% over the state-of-the-art baseline, with improvements ranging from 4% to 34% for the specific depressed, anxious, stressed, and typical emotional states.
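The following is a minimal sketch of the augmentation and classification stage described above, assuming scikit-learn and NumPy. It takes a feature matrix that has already been extracted and reduced (e.g. with FCBF); the arrays X and y, the augmentation fraction, and the noise scale are illustrative placeholders, not values taken from the paper.

```python
# Sketch of Gaussian-noise augmentation + RBF-SVM with Leave-One-Out evaluation.
# X (n_users x n_features) and y are hypothetical inputs; fraction and
# noise_scale are illustrative parameters, not the paper's actual settings.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def augment_with_gaussian_noise(X, y, fraction=0.5, noise_scale=0.01, seed=None):
    """Randomly pick a fraction of the training samples and add small
    Gaussian noise to their feature vectors, keeping the original labels."""
    rng = np.random.default_rng(seed)
    n_extra = int(fraction * len(X))
    idx = rng.choice(len(X), size=n_extra, replace=True)
    X_noisy = X[idx] + rng.normal(0.0, noise_scale, size=X[idx].shape)
    return np.vstack([X, X_noisy]), np.concatenate([y, y[idx]])

def loo_accuracy(X, y):
    """Leave-One-Out evaluation: augment each training fold, fit an
    RBF-kernel SVM, and test on the single held-out user."""
    correct = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        X_tr, y_tr = augment_with_gaussian_noise(X[train_idx], y[train_idx])
        scaler = StandardScaler().fit(X_tr)
        clf = SVC(kernel="rbf").fit(scaler.transform(X_tr), y_tr)
        pred = clf.predict(scaler.transform(X[test_idx]))
        correct += int(pred[0] == y[test_idx][0])
    return correct / len(X)
```

Augmentation is applied only inside each training fold so that the held-out user is never used, directly or through a noisy copy, to fit the model.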
