Abstract

We propose a facial expression mapping technology between virtual avatars and Head-Mounted Display (HMD) users. HMDs allow people to enjoy an immersive Virtual Reality (VR) experience, and a virtual avatar can represent the user in the virtual environment. However, synchronization of the virtual avatar's expressions with those of the HMD user is limited. The major problem with wearing an HMD is that a large portion of the user's face is occluded, making facial recognition difficult in an HMD-based virtual environment. To overcome this problem, we propose a facial expression mapping technology using photo-reflective sensors. The sensors attached inside the HMD measure the reflection intensity between the sensors and the user's face. The intensity values of five basic facial expressions (Neutral, Happy, Angry, Surprised, and Sad) are used to train a classifier that estimates the user's facial expression. At SIGGRAPH 2019, users can experience two applications: synchronization of facial expressions with the avatar, and simple manipulation of a virtual environment through facial expressions.
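To make the training step concrete, the following is a minimal sketch of how a classifier over photo-reflective sensor readings might be trained. The sensor count, sample counts, placeholder data, and the choice of an SVM are illustrative assumptions, not the authors' implementation; only the five expression labels come from the abstract.

```python
# Illustrative sketch only: sensor count, data shapes, and the SVM classifier
# are assumptions for demonstration, not the authors' actual method.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

EXPRESSIONS = ["Neutral", "Happy", "Angry", "Surprised", "Sad"]
N_SENSORS = 16           # hypothetical number of photo-reflective sensors inside the HMD
SAMPLES_PER_CLASS = 200  # hypothetical number of recorded frames per expression

# Placeholder data: each row stands in for one frame of reflection-intensity
# readings captured while the user holds a given expression.
rng = np.random.default_rng(0)
X = rng.random((len(EXPRESSIONS) * SAMPLES_PER_CLASS, N_SENSORS))
y = np.repeat(np.arange(len(EXPRESSIONS)), SAMPLES_PER_CLASS)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Standardize the intensity values, then fit a multi-class classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# At runtime, a new frame of sensor readings is classified the same way and the
# predicted expression can be mapped onto the avatar.
frame = rng.random((1, N_SENSORS))
print("predicted expression:", EXPRESSIONS[int(clf.predict(frame)[0])])
```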
