Abstract
Stress and emotion are complex phenomena that play significant roles in the quality of human life. Emotion plays a major role in motivation, perception, cognition, creativity, attention, learning and decision-making (Seymour et al., 2008). A major problem in understanding emotion is settling on a definition: even psychologists have difficulty agreeing on what counts as an emotion and how many types of emotions exist. Kleinginna gathered and analyzed 92 definitions of emotion from the literature of the time and concluded that emotion is a complex set of interactions among subjective and objective factors, mediated by neural/hormonal systems (Horlings, 2008). In fact, emotion is a subcategory of stress. Considerable research has been devoted to the assessment of stress and emotion in recent years. Most studies in this domain use peripheral signals such as respiratory rate, Skin Conductance (SC), Blood Volume Pulse (BVP) (Zhai et al., 2006) and temperature (McFarland, 1985). Most previous research has investigated the use of EEG and peripheral signals separately, and little attention has been paid so far to the fusion of EEG and peripheral signals (Chanel, 2009; Chanel et al., 2009; Hosseini, 2009). In one study, Aftanas et al. (2004) showed significant differentiation of arousal based on EEG data collected from participants watching high-, intermediate- and low-arousal images. Chanel (2009) asked participants to recall past emotional episodes and obtained an accuracy of 88% for 3 categories using EEG with a Support Vector Machine (SVM) classifier. Hosseini et al. (2009) used an acquisition protocol based on visual image induction to record EEG and peripheral signals under 2 categories of emotional stress states (calm-neutral and negatively excited) and obtained an accuracy of 78.3% using EEG signals with an SVM classifier. Kim et al. (2004) used a combination of music and story as stimuli with 50 participants to introduce a user-independent system; the results showed accuracies of 78.4% and 61% for 3 and 4 categories of emotions, respectively. Takahashi (2004) used film clips to stimulate participants with five different emotions, resulting in 42% of patterns correctly identified. Schaaff and Schultz (2009) used pictures from the International Affective Picture System (IAPS) to induce three emotional states: pleasant, neutral, and unpleasant, and obtained an accuracy of 66.7% for the three classes of emotion based solely on EEG signals.
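Several of the studies cited above classify emotional states from EEG features with an SVM. As a rough illustration only (not any of the cited authors' pipelines), the sketch below assumes pre-extracted EEG feature vectors, for example band-power features, and integer emotion labels, and uses scikit-learn; the data here are random placeholders.

```python
# Minimal, illustrative sketch of SVM-based emotion classification from EEG
# features. Hypothetical data and parameters; not the pipeline of any cited study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 120 trials x 32 features standing in for EEG band powers,
# with 3 assumed emotion classes (e.g., calm, neutral, negatively excited).
X = rng.normal(size=(120, 32))
y = rng.integers(0, 3, size=120)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# Standardize features, then fit an RBF-kernel SVM and report test accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

In practice, the reported accuracies depend heavily on the feature extraction applied to the raw EEG and peripheral signals, not just on the classifier.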