Abstract

This paper presents a preliminary investigation of a novel approach to emotion recognition using pupil position in Virtual Reality (VR). We explore pupil position as an eye-tracking feature for four-class emotion classification according to the four-quadrant model of emotions, via the presentation of 360° videos in VR. A total of ten subjects participated in the experiment. A 360° video with four emotion-stimulation sessions was presented in VR to evoke the users' emotions. Eye data were recorded and collected with a Pupil Labs eye tracker, and emotion classification was performed using pupil position alone. The classifier used in this investigation was the Support Vector Machine (SVM) machine learning algorithm. The results show that the best accuracy achieved in this four-class classification was 59.19%.
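The classification pipeline described above can be sketched as follows. This is a minimal illustration assuming scikit-learn; the feature layout (flattened windows of pupil x/y coordinates), quadrant labels, and all data here are hypothetical stand-ins, not the authors' actual pipeline or dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical data: each sample is a window of (x, y) pupil positions
# from the eye tracker, flattened into a fixed-length feature vector.
n_samples, window = 200, 50
X = rng.normal(size=(n_samples, window * 2))
# Labels 0..3 for the four quadrants of the valence-arousal model.
y = rng.integers(0, 4, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# SVC handles the four-class problem via one-vs-one voting internally;
# standardizing features first is standard practice for RBF-kernel SVMs.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"four-class accuracy: {acc:.2%}")
```

With random synthetic labels, accuracy hovers near the 25% chance level; on real pupil-position features the paper reports up to 59.19%.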
