Abstract

Live performance is an intuitive way to draft the motion that a choreographer has in mind. In this paper we present a novel approach to choreographing motion by live performance captured with three degree-of-freedom (3-DOF) accelerometers. The process begins by placing the accelerometers on the user's limbs at pre-specified positions. The computer then recognizes the performed actions using a Hidden Markov Model (HMM), pre-trained on acceleration data samples generated automatically from a pre-segmented motion capture database. Finally, the recognized actions are synthesized with motion retiming and exaggeration driven by the acceleration signals from the accelerometers. This method enables intuitive rapid prototyping of choreographed motion for animation pre-production, avatar control in virtual reality, game-like scenarios, and similar applications. The experimental results show that it effectively recognizes actions with spatio-temporal variation, and that it is easy to use, especially for a novice with little experience. Copyright © 2009 John Wiley & Sons, Ltd.
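To illustrate the recognition step described above, the sketch below classifies a quantized accelerometer sequence by evaluating its likelihood under per-action HMMs with the standard forward algorithm and picking the most likely action. All model parameters, the two action names, and the two-symbol quantization are hypothetical placeholders, not the paper's trained models.

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm."""
    n = len(start)
    # alpha[i]: probability of the observed prefix, ending in state i
    alpha = [start[i] * emit[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * trans[j][i] for j in range(n)) * emit[i][o]
                 for i in range(n)]
    return math.log(sum(alpha))

# Hypothetical 2-state models for two actions; observations are
# quantized accelerometer magnitudes (0 = low, 1 = high).
# Each entry: (initial distribution, transition matrix, emission matrix).
models = {
    "wave": ([0.6, 0.4],
             [[0.7, 0.3], [0.3, 0.7]],
             [[0.9, 0.1], [0.2, 0.8]]),
    "kick": ([0.5, 0.5],
             [[0.5, 0.5], [0.5, 0.5]],
             [[0.1, 0.9], [0.8, 0.2]]),
}

def recognize(obs):
    # Choose the action whose HMM assigns the sequence the highest likelihood.
    return max(models, key=lambda a: forward_log_likelihood(obs, *models[a]))
```

In a real system each action's HMM would be trained (e.g. with Baum-Welch) on the synthetic acceleration samples derived from the motion capture database, rather than hand-specified as here.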
