Abstract

Virtual characters are now widely used in games, computer-generated (CG) movies, virtual reality (VR), and communication media. Continued innovation in motion capture means that increasingly natural representations of a three-dimensional character's motion should be achievable. Many researchers have investigated how virtual characters interact with their surrounding environment through spatial relationships, which have been used to adapt character motion while preserving its original qualities. However, technical problems remain to be solved before characters can be controlled in augmented reality (AR) environments, which blend virtual content with the real world; one route is to adapt motion from existing motion datasets to the differences in each environment. In this paper, we investigate a novel method for automatic, motion-preserving adaptation of a virtual character in AR environments. Our method recognizes specific objects (e.g., a puddle) and uses the spatial properties of the user's surrounding space, such as object types and positions; we ran validation experiments to confirm that the resulting motion is accurate enough to improve the AR experience. Our experimental study showed positive results in terms of smooth character motion across AR configurations, and participants using AR reported a greater sense of co-presence with the character when its motion was adapted.
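
To make the kind of adaptation described above concrete, the following is a minimal sketch of a rule that shifts a planned foot placement away from a recognized object such as a puddle. All names (`SceneObject`, `adapt_step`) and the radial-offset logic are hypothetical illustrations under assumed inputs (object type, ground-plane position, and footprint radius from recognition), not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    kind: str                    # recognized object type, e.g. "puddle"
    position: tuple[float, float]  # (x, z) center on the ground plane, meters
    radius: float                # approximate footprint radius, meters

def adapt_step(target: tuple[float, float],
               obstacles: list[SceneObject],
               clearance: float = 0.1) -> tuple[float, float]:
    """Move a planned foot placement out of any obstacle footprint,
    pushing it radially to the obstacle boundary plus a clearance margin."""
    x, z = target
    for obj in obstacles:
        dx, dz = x - obj.position[0], z - obj.position[1]
        dist = (dx * dx + dz * dz) ** 0.5
        min_dist = obj.radius + clearance
        if dist < min_dist:
            if dist == 0.0:
                # Target exactly on the object's center: pick an arbitrary direction.
                dx, dz, dist = 1.0, 0.0, 1.0
            scale = min_dist / dist
            x = obj.position[0] + dx * scale
            z = obj.position[1] + dz * scale
    return (x, z)

# Example: a step that would land in a puddle is moved just past its edge.
puddle = SceneObject(kind="puddle", position=(0.0, 0.0), radius=0.3)
print(adapt_step((0.1, 0.0), [puddle]))  # -> (0.4, 0.0)
```

In a full system this per-step correction would feed back into the motion-synthesis layer so the surrounding animation stays smooth; the sketch only shows the spatial-reasoning step that the abstract attributes to object recognition.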
