Abstract

This paper investigates whether some well-understood principles of human behavioral analysis can be used to design novel paradigms for affective brain–computer/machine interfaces. This is achieved using visual, audio, and audiovisual stimuli representing human emotions. The analysis of brain responses to such stimuli involves several challenges related to the conditioning of brain electrical responses, the extraction of the responses to stimuli, and the mutual information between the several physiological recording modalities used. The analysis is performed in the time–frequency domain using multichannel empirical mode decomposition (EMD), which proves very accurate in the joint analysis of neurophysiological and peripheral body signals. Our results indicate the usefulness of such an approach and confirm the feasibility of affective brain–computer/machine interfaces.
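The abstract's core signal-processing tool is empirical mode decomposition, which iteratively "sifts" a signal into oscillatory intrinsic mode functions (IMFs) plus a slow residue. The sketch below is an illustration only: a minimal single-channel EMD in NumPy/SciPy, not the multichannel (multivariate) EMD the paper actually uses, which additionally projects the multivariate signal along direction vectors before envelope estimation. The function names, stopping threshold, and iteration caps here are my own simplifying choices, not the authors'.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, t, max_iter=50, tol=0.05):
    """Extract one intrinsic mode function (IMF) by sifting.

    Repeatedly subtracts the mean of the upper and lower cubic-spline
    envelopes until that mean is small relative to the signal.
    """
    h = x.copy()
    for _ in range(max_iter):
        # Indices of strict local maxima and minima
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema to build envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        mean_env = 0.5 * (upper + lower)
        h = h - mean_env
        # Simplified stopping rule: mean envelope is negligible
        if np.sum(mean_env ** 2) / np.sum(h ** 2 + 1e-12) < tol:
            break
    return h

def emd(x, t, max_imfs=8):
    """Decompose x into a list of IMFs plus a monotonic residue."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        # Stop once the residue no longer oscillates
        if np.all(np.diff(residue) >= 0) or np.all(np.diff(residue) <= 0):
            break
        imf = sift(residue, t)
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue
```

By construction the decomposition is exact: summing the IMFs and the residue reconstructs the input signal, which is what makes EMD attractive for joint time–frequency analysis of nonstationary physiological recordings.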


