Abstract

We propose a facial expression mapping technology between virtual avatars and Head-Mounted Display (HMD) users. HMDs allow people to enjoy an immersive Virtual Reality (VR) experience, and a virtual avatar can represent the user in the virtual environment. However, synchronization of the virtual avatar's expressions with those of the HMD user is limited. The major problem is that wearing an HMD occludes a large portion of the user's face, making facial expression recognition difficult in an HMD-based virtual environment. To overcome this problem, we propose a facial expression mapping technology using photo-reflective sensors. The sensors attached inside the HMD measure the reflection intensity between the sensors and the user's face. The intensity values of five basic facial expressions (Neutral, Happy, Angry, Surprised, and Sad) are used to train a classifier that estimates the user's facial expression. At SIGGRAPH 2019, attendees can experience two applications: synchronization of facial expressions with the avatar, and simple manipulation of a virtual environment through facial expressions.
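The following is a minimal sketch of the classification step described above, not the authors' implementation: per-user calibration samples of photo-reflective sensor intensities, labeled with the five basic expressions, are used to fit a classifier that then maps live sensor readings to an expression label. The sensor count, the SVM model choice, and all function names are illustrative assumptions.

```python
# Sketch: estimating one of five facial expressions from photo-reflective
# sensor intensities measured inside an HMD. Sensor count and classifier
# choice are assumptions, not the authors' published configuration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EXPRESSIONS = ["Neutral", "Happy", "Angry", "Surprised", "Sad"]
NUM_SENSORS = 16  # assumed number of photo-reflective sensors in the HMD

def train_expression_classifier(intensities, labels):
    """Fit a classifier on calibration data.

    intensities: (n_samples, NUM_SENSORS) reflection-intensity vectors
    labels: expression index (0..4) for each sample
    """
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(intensities, labels)
    return model

# Example with synthetic calibration data: in practice the user would record
# a short burst of sensor readings while holding each of the five expressions.
rng = np.random.default_rng(0)
X = rng.random((250, NUM_SENSORS))
y = np.repeat(np.arange(len(EXPRESSIONS)), 50)
clf = train_expression_classifier(X, y)

frame = rng.random((1, NUM_SENSORS))       # one live sensor reading per frame
label = clf.predict(frame)[0]
print(EXPRESSIONS[label])                  # estimated expression used to drive the avatar
```

In a live system, each predicted label (or the classifier's per-class scores) would be sent every frame to the avatar's face rig to blend toward the corresponding expression.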
