Abstract

When a real operator drives a virtual human in real time via motion capture to perform complex product assembly and disassembly simulations, very high driving accuracy is required to meet the quality demands of interactivity and of the simulation results. To improve driving accuracy in a virtual reality environment, this paper proposes a method that analyzes the factors influencing the real-time driving accuracy of the virtual human and optimizes them. A systematic analysis of these factors is given; they can be sorted into hardware factors and software factors. We find that the software factors are the dominant ones and that their individual influences are difficult to analyze separately. We therefore treat the virtual human kinematic system as a fuzzy system and improve the real-time driving accuracy through optimization. First, a real-time driving model is built on dynamic constraints and body-joint rotation information, and it supports personalized human driving. Second, a function is established to describe the driving error during interactive operations in the virtual environment. Then, based on the principle of minimum cumulative error, an optimization model is established, with its optimization zone and constraints specified according to standard Chinese adult body dimensions. Next, the model is solved with a genetic algorithm to obtain the virtual human segment dimensions that best match the real operator. Finally, the method is verified with an example of virtual assembly of an automobile engine. The results show that the method improves driving accuracy effectively.

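As an illustrative sketch only (not the paper's implementation), the following Python code shows how a genetic algorithm could search for virtual human segment lengths that minimize a cumulative driving-error function within anthropometric bounds. The segment list, the bound values, and the surrogate error function are hypothetical placeholders; in the paper the error is computed from motion-capture data during interactive operations and the bounds come from standard Chinese adult dimensions.

```python
import numpy as np

# Hypothetical segment-length bounds (metres); stand-ins for the paper's
# constraints derived from standard Chinese adult body dimensions.
BOUNDS = np.array([
    [0.25, 0.40],   # upper arm
    [0.20, 0.35],   # forearm
    [0.35, 0.55],   # thigh
    [0.30, 0.50],   # shank
    [0.40, 0.60],   # torso
])

def cumulative_error(segment_lengths, operator_data):
    """Surrogate for the paper's driving-error function: in practice this
    would sum, over all captured frames, the deviation between the operator's
    motion and the motion reproduced by a virtual human with these segment
    lengths. Here a simple squared deviation stands in for that computation."""
    return float(np.sum((segment_lengths - operator_data) ** 2))

def genetic_optimize(operator_data, pop_size=60, generations=200,
                     crossover_rate=0.9, mutation_rate=0.1, rng=None):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, with candidates clipped to the anthropometric bounds."""
    rng = rng or np.random.default_rng(0)
    lo, hi = BOUNDS[:, 0], BOUNDS[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(BOUNDS)))

    def fitness(individual):
        return cumulative_error(individual, operator_data)

    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = [pop[np.argmin(scores)].copy()]          # elitism: keep best
        while len(new_pop) < pop_size:
            # Tournament selection of two parents.
            i, j = rng.integers(pop_size, size=2)
            p1 = pop[i] if scores[i] < scores[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            p2 = pop[i] if scores[i] < scores[j] else pop[j]
            child = p1.copy()
            if rng.random() < crossover_rate:               # blend crossover
                alpha = rng.random(len(BOUNDS))
                child = alpha * p1 + (1 - alpha) * p2
            if rng.random() < mutation_rate:                # Gaussian mutation
                child = child + rng.normal(0.0, 0.01, size=len(BOUNDS))
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)

    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmin(scores)], float(scores.min())

if __name__ == "__main__":
    # Placeholder for data derived from motion capture of the real operator.
    operator_data = np.array([0.32, 0.27, 0.45, 0.41, 0.52])
    best, err = genetic_optimize(operator_data)
    print("best segment lengths:", np.round(best, 3), "cumulative error:", err)
```

The design mirrors the abstract's workflow at a high level: an error function scores a candidate set of segment dimensions, bound constraints restrict the search to plausible human proportions, and the genetic algorithm returns the dimensions with the smallest cumulative error.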