Abstract
An enhanced sensation of reality in multimedia content can be achieved by creating realistic multimedia environments that combine visual, auditory, and olfactory information. Although affective information from video and audio has been extensively studied, the olfactory sense has received less attention. One way to assess the human experience of audio, video, or odors is to investigate physiological signals. In this study, 23 subjects experienced pleasant, unpleasant, and neutral odors while their electroencephalogram (EEG) and electrocardiogram (ECG) were recorded. Two independent three-class classifiers were trained and tested, using EEG or ECG features. The results reveal a significant increase in classification performance when EEG features were used (Cohen's kappa $\kappa = 0.44 \pm 0.14$). The results also indicate that it is possible to automatically classify the perception of unpleasant odors using EEG signals, but classification performance decreases significantly when discriminating between pleasant and neutral odors. Among the EEG features, the Wasserstein distance estimated between trial and baseline power achieved the highest classification performance. Features from ECG signals did not yield classification performance significantly above chance.
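The two quantities named in the abstract, the Wasserstein distance between trial and baseline power and Cohen's kappa, can be illustrated with a minimal sketch. The band-power samples below are synthetic placeholders, not the study's data, and the feature pipeline shown is an assumption for illustration only:

```python
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical EEG band-power samples: the paper's actual preprocessing
# and power estimation are not specified in the abstract.
baseline_power = rng.gamma(shape=2.0, scale=1.0, size=500)
trial_power = rng.gamma(shape=2.5, scale=1.2, size=500)

# Wasserstein (earth mover's) distance between the trial and baseline
# power distributions -- the EEG feature the abstract highlights.
w = wasserstein_distance(trial_power, baseline_power)

# Cohen's kappa: chance-corrected agreement between predicted and true
# labels for a three-class problem (0 = neutral, 1 = pleasant, 2 = unpleasant).
y_true = [0, 1, 2, 0, 1, 2, 0, 1]
y_pred = [0, 1, 2, 0, 2, 2, 0, 1]
kappa = cohen_kappa_score(y_true, y_pred)
```

Kappa corrects raw accuracy for chance agreement, which is why it is a stricter summary than accuracy for a three-class task like this one; 0 corresponds to chance-level prediction and 1 to perfect agreement.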