Abstract

In recent years, the rapid development of diverse media has been evident in disparate fields such as consumer electronics, automotive infotainment, and healthcare software. There is a need for innovative methods to assess user-perceived Quality of Experience (QoE) as a proxy for consumer satisfaction with such systems and services. Users' emotional state plays a key role in QoE; it therefore needs to be considered in user experience evaluation and in the design of stereoscopic 3D video content. In the present article we introduce a specially designed model based on a feedforward Multilayer Perceptron Artificial Neural Network, as an appropriate Machine Learning technique, for estimating the emotional state of humans while viewing various categories of stereoscopic 3D video content. The goal is to design an emotional state estimator based on direct psychophysiological measurements. The considered psychophysiological signals include heart rate (HR) derived from the electrocardiogram (ECG), electrodermal activity (EDA), and brain activity (BA) extracted from electroencephalogram (EEG) signals. Participants watched a series of 3D video contents varying in visual quality while these psychophysiological signals were recorded, and reported their subjectively experienced emotions using a Self-Assessment Manikin (SAM) questionnaire. The obtained results show that a highly precise estimator of emotional states can be constructed in this way.
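To illustrate the general approach described above, the following is a minimal sketch (not the authors' implementation) of a feedforward multilayer perceptron that maps per-clip psychophysiological features to SAM-style valence and arousal ratings. The feature set (mean HR, mean EDA level, EEG band powers), the 1-9 SAM scale, the hidden-layer sizes, and the synthetic data are all illustrative assumptions, not values taken from the article.

```python
# Minimal sketch of an MLP emotional-state estimator (assumptions noted above).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature vector per viewing session: mean HR (from ECG),
# mean EDA level, and a few EEG band-power features -> 8 features total.
n_samples, n_features = 200, 8
X = rng.normal(size=(n_samples, n_features))      # placeholder feature matrix
y = rng.uniform(1, 9, size=(n_samples, 2))        # SAM valence/arousal, 1-9 scale

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Feedforward MLP regressor; hidden-layer sizes are assumed for illustration.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), activation="relu",
                 solver="adam", max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```

In a real study, the placeholder matrix X would be replaced by features extracted from the recorded ECG, EDA, and EEG signals, and y by the participants' SAM ratings collected after each 3D clip.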
