Abstract

Virtual humans are increasingly used in VR applications, but their animation remains a challenge, especially when complex tasks must be carried out in interaction with the user. In many such applications, the credibility of the virtual characters plays a major role in the sense of presence. Motion editing techniques assume that natural laws are intrinsically encoded in prerecorded trajectories and that careful modifications can preserve these laws, leading to credible autonomous actors. However, complete knowledge of all the constraints is required to ensure continuity, or to synchronize and blend the several actions necessary to achieve a given task. We propose a framework capable of performing these tasks in an interactive environment that can change at each frame, depending on the user’s orders. This framework enables VR applications to animate, in real time, from dozens of characters under complex constraints to hundreds of characters when only ground adaptation is performed. It offers the following capabilities: motion synchronization, blending, retargeting, and adaptation, thanks to an enhanced inverse kinetics and kinematics solver. To evaluate this framework, we compared the motor behavior of subjects in real and in virtual environments.
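The motion blending capability mentioned above typically interpolates between the joint rotations of two prerecorded clips. The abstract does not give the authors' algorithm; the sketch below is a generic, hypothetical illustration of per-joint quaternion blending via spherical linear interpolation (slerp), which is the standard building block for such systems. All function and variable names here are assumptions, not the paper's API.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    # Take the shorter arc: negate one quaternion if the dot product is negative.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    # For nearly parallel quaternions, fall back to normalized linear interpolation.
    if dot > 0.9995:
        blended = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        norm = math.sqrt(sum(c * c for c in blended))
        return tuple(c / norm for c in blended)
    theta = math.acos(dot)           # angle between the two rotations
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def blend_poses(pose_a, pose_b, weight):
    """Blend two skeleton poses (lists of per-joint quaternions) with a scalar weight."""
    return [slerp(qa, qb, weight) for qa, qb in zip(pose_a, pose_b)]
```

In a full system the blend weight would itself be driven by the synchronization stage (e.g. normalized phase of a walk cycle), and the blended pose would then be corrected by the inverse kinematics solver to satisfy constraints such as foot–ground contact.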
