Abstract

Our study proposes a new local regression model for accurately controlling an avatar in real time using six inertial sensors. Building such a system for interactive control of a full-body avatar is challenging because the control signals from performance interfaces are usually too sparse to fully determine the whole-body movement of a human actor. We use a pre-captured motion database to construct a group of local regression models, which are combined with the control signals to synthesize whole-body human movement. We verify the effectiveness of the proposed system by synthesizing a variety of human movements from actors' control input in real time. Compared with previous models, the proposed model produces more accurate results, and because the system is much cheaper than commercial motion capture systems, it is well suited to everyday use.
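To illustrate the core idea of synthesizing a full-body pose from sparse control signals with local regression, the sketch below fits a linear model on the nearest database examples for each query. This is a minimal, assumed formulation: the function name, the k-nearest-neighbor retrieval, and the ridge regularization are illustrative choices, not the paper's exact method.

```python
import numpy as np

def synthesize_pose(control_signal, db_signals, db_poses, k=50, ridge=1e-3):
    """Sketch of local regression from a motion database (assumed formulation).

    control_signal : (d,)  current sensor reading (d is small, e.g. 6 sensors).
    db_signals     : (N, d) sensor readings stored in the motion database.
    db_poses       : (N, D) corresponding full-body poses (D >> d).
    """
    # Retrieve the k database frames whose sensor readings are closest
    # to the current control signal.
    dists = np.linalg.norm(db_signals - control_signal, axis=1)
    idx = np.argsort(dists)[:k]
    X, Y = db_signals[idx], db_poses[idx]

    # Fit a local linear map Y ~ [X, 1] @ W; ridge regularization keeps
    # the normal equations well conditioned on small neighborhoods.
    Xb = np.hstack([X, np.ones((k, 1))])
    W = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(Xb.shape[1]), Xb.T @ Y)

    # Evaluate the local model at the query to obtain a full-body pose.
    return np.append(control_signal, 1.0) @ W
```

Because the model is refit from a small neighborhood at every frame, it adapts to the local structure of the motion database rather than relying on a single global mapping.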
