Abstract

Virtual three-dimensional creatures are active actors in many types of applications nowadays, such as virtual reality, games and computer animation. The virtual actors encountered in these applications are very diverse, but usually have human-like behavior and facial expressions. This paper deals with the mapping of facial expressions between virtual characters, based on anthropometric proportions and geometric manipulations through moving influence zones. The facial proportions of a base model are used to transfer expressions to any other model with similar global characteristics (if the base model is a human, for instance, the other models need to have two eyes, one nose and one mouth). With this solution, it is possible to insert new virtual characters into real-time applications without having to go through the tedious process of customizing the characters' emotions.

Highlights

  • Virtual three-dimensional creatures are active actors in many types of applications nowadays, such as virtual reality, games and computer animation

  • The virtual actors encountered in those applications are very diverse, but usually have human-like behavior and facial expressions

  • The adopted model of influence zones presents some limitations, which will be discussed later. In their Expression Cloning work, Noh and Neumann [13] applied similar deformation techniques to transfer different types of expressions between two virtual characters with distinct mesh topologies, serving as a basis for many other works [10], [14], [16], [17], [18], [19], [20]. Some of these works used models with different facial features, similar to Noh and Neumann [13], and some used only human models; in either case, transferring the expressions required finding dense correspondences between the models through volume morphing and a cylindrical projection, and then applying the deformations with Radial Basis Functions (RBF), as in the sketch after this list

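The RBF step mentioned in the last highlight can be pictured with a minimal numpy sketch. It assumes the dense-correspondence and cylindrical-projection steps have already produced matching landmarks on the target's neutral face together with their expression displacements; the Gaussian kernel, the width parameter sigma and the function name rbf_expression_transfer are assumptions made for this example, not the exact formulation of the cited works.

```python
import numpy as np

def rbf_expression_transfer(dst_landmarks, landmark_displacements, dst_vertices, sigma):
    """Propagate landmark displacements to a whole target mesh with Gaussian RBFs.

    dst_landmarks          : (n, 3) landmark positions on the target's neutral face
    landmark_displacements : (n, 3) displacement of each landmark in the expression,
                             already retargeted from the source character
    dst_vertices           : (m, 3) all vertices of the target's neutral mesh
    sigma                  : kernel width, roughly the size of a facial feature
    """
    # Gaussian kernel matrix between the landmarks themselves.
    d2 = np.sum((dst_landmarks[:, None, :] - dst_landmarks[None, :, :]) ** 2, axis=-1)
    phi = np.exp(-d2 / (2.0 * sigma ** 2))                     # (n, n)

    # Coefficients so the interpolant reproduces the displacements at the landmarks.
    coeffs = np.linalg.solve(phi, landmark_displacements)      # (n, 3)

    # Evaluate the interpolant at every mesh vertex and deform the mesh.
    d2v = np.sum((dst_vertices[:, None, :] - dst_landmarks[None, :, :]) ** 2, axis=-1)
    phi_v = np.exp(-d2v / (2.0 * sigma ** 2))                  # (m, n)
    return dst_vertices + phi_v @ coeffs
```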

Summary

INTRODUCTION

Virtual three-dimensional creatures are active actors in many types of applications nowadays, such as virtual reality, games and computer animation. The facial proportions of a base model are used to transfer expressions to any other model with similar global characteristics (if the base model is a human, for instance, the other models need to have two eyes, one nose and one mouth) using a simple and intuitive deformation system. With this solution, it is possible to insert new virtual characters into real-time applications without having to go through the tedious process of customizing the characters' emotions.
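The proportion-driven transfer described above can be sketched briefly: an influence zone's displacement measured on the base model is rescaled by the ratio of an anthropometric measure (mouth width here) and then applied to the target mesh with a smooth falloff. This is only a minimal illustration under assumed conventions; the landmark names, the choice of measure and the quadratic falloff are hypothetical and not taken from the paper.

```python
import numpy as np

def measure(landmarks, a, b):
    """Distance between two named landmarks, e.g. the mouth corners."""
    return np.linalg.norm(landmarks[a] - landmarks[b])

def retarget_offset(base_offset, base_lm, target_lm, a="mouth_left", b="mouth_right"):
    """Rescale a base-model influence-zone offset by an anthropometric ratio."""
    return base_offset * (measure(target_lm, a, b) / measure(base_lm, a, b))

def apply_influence_zone(vertices, center, radius, offset):
    """Move target vertices inside a spherical influence zone with a smooth falloff."""
    dist = np.linalg.norm(vertices - center, axis=1)
    weight = np.clip(1.0 - dist / radius, 0.0, 1.0) ** 2       # quadratic falloff
    return vertices + weight[:, None] * offset

# Example: a base-model "smile" offset retargeted to a wider-mouthed target model.
base_lm   = {"mouth_left": np.array([-1.0, 0.0, 0.0]),
             "mouth_right": np.array([ 1.0, 0.0, 0.0])}
target_lm = {"mouth_left": np.array([-1.5, 0.0, 0.0]),
             "mouth_right": np.array([ 1.5, 0.0, 0.0])}
offset = retarget_offset(np.array([0.1, 0.2, 0.0]), base_lm, target_lm)   # scaled 1.5x
deformed = apply_influence_zone(np.random.rand(100, 3), target_lm["mouth_right"],
                                radius=0.8, offset=offset)
```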

RELATED WORK
MEASURES MANIPULATION
CASE STUDY
CONCLUSION