Abstract

Embodied interactive virtual characters, such as virtual humans or animals, have been actively used for various Virtual/Augmented/Mixed Reality (VAMR) applications, and researchers have developed different types of embodied virtual characters and studied their effects on users' perception and behavior. This tutorial aims to provide the audience with background knowledge on research in embodied interactive virtual characters and on how to develop such interactive characters for their specific applications, particularly focusing on human-in-the-loop systems (the Wizard of Oz paradigm). The tutorial will first review prior research on interactive virtual characters, focusing on the social influence of these entities over users, e.g., the sense of social presence, trust, and collaboration, while discussing the recent trend toward convergence among intelligent virtual agents (IVAs), MR, and the Internet of Things (IoT) in the scope of virtual characters interacting with their physical surroundings. We will also share our recent research findings and lessons learned from more than five years of experience researching interactive virtual characters and conducting user studies at the Synthetic Reality Lab (SREAL), University of Central Florida (UCF). The tutorial will explain how to develop virtual characters in Unity using third-party assets and plugins, such as Mixamo and Rogo Digital's LipSync. The audience will follow step-by-step instructions with the provided materials and end up with a simple interactive virtual character that they can control through conventional 2D user interfaces, in the manner of human-in-the-loop studies. The tutorial will also explain how to develop a sensing module that captures the current state of the surrounding environment, which can establish a realistic connection between the physical and virtual worlds. For example, an Arduino board with a couple of sensors, e.g., a wind sensor, can be used to detect wind in the real environment and trigger coherent events in the virtual environment, such as blowing a virtual ball resting on a table.
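
To make the human-in-the-loop idea concrete, the following is a minimal sketch of a Wizard-of-Oz style control script in Unity C#. It is not taken from the tutorial materials: the class name, key bindings, and Animator trigger names ("Wave", "Nod") are hypothetical placeholders. A hidden operator presses keys on a conventional 2D interface to fire pre-authored animations and speech on the Mixamo-rigged character.

```csharp
using UnityEngine;

// Hypothetical sketch of a Wizard-of-Oz controller attached to the virtual character.
// A hidden human operator uses simple key presses (a 2D interface) to trigger
// pre-authored behaviors. Trigger names and the audio clip are placeholders.
public class WizardOfOzController : MonoBehaviour
{
    public Animator characterAnimator;   // Animator driving the rigged avatar
    public AudioSource voiceSource;      // plays pre-recorded utterances
    public AudioClip greetingClip;       // e.g., "Hello, nice to meet you."

    void Update()
    {
        // Key bindings act as the operator's 2D user interface.
        if (Input.GetKeyDown(KeyCode.Alpha1))
            characterAnimator.SetTrigger("Wave");   // assumed Animator trigger

        if (Input.GetKeyDown(KeyCode.Alpha2))
            characterAnimator.SetTrigger("Nod");    // assumed Animator trigger

        if (Input.GetKeyDown(KeyCode.Space) && greetingClip != null)
            voiceSource.PlayOneShot(greetingClip);  // a lip-sync plugin (e.g., LipSync)
                                                    // can be driven from this clip
    }
}
```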
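
The physical-virtual connection can be sketched in a similar way. Below is an illustrative Unity C# script, under the assumption that the Arduino streams one numeric wind-sensor reading per line over a serial connection; the port name ("COM3"), baud rate, and threshold are assumptions, not values from the tutorial, and System.IO.Ports requires the .NET 4.x API compatibility level in Unity. When the reading exceeds the threshold, the script nudges the virtual ball so the real wind appears to affect the virtual scene.

```csharp
using System;
using System.IO.Ports;   // requires .NET 4.x API compatibility level in Unity
using UnityEngine;

// Hypothetical sketch: read wind-sensor values streamed by an Arduino over serial
// and, when a reading exceeds a threshold, push a virtual ball so the physical
// wind triggers a coherent event in the virtual environment.
public class WindSensorBridge : MonoBehaviour
{
    public Rigidbody virtualBall;          // the ball on the virtual table
    public float windThreshold = 50f;      // raw sensor units (placeholder)
    public float pushForce = 2f;

    private SerialPort port;

    void Start()
    {
        port = new SerialPort("COM3", 9600) { ReadTimeout = 20 };  // assumed port/baud
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine();                 // e.g., "87"
            if (float.TryParse(line, out float reading) && reading > windThreshold)
            {
                // Nudge the ball in the assumed wind direction (world forward).
                virtualBall.AddForce(Vector3.forward * pushForce, ForceMode.Impulse);
            }
        }
        catch (TimeoutException) { /* no new reading this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```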
