Abstract

Due to the increasingly massive use of simulation for studying the interaction between humans and workplaces, virtual design and CAD systems will be more and more concerned with virtual actors able to emulate real human motion with high fidelity. In this work we present results related to the synthetic generation of human-like motion trajectories to drive a model of a human arm. The problem of producing human-like trajectories for complex articulated structures has been re-framed as that of computing the parameters of a neural network from a large, redundant data set, without involving a mathematical model of either direct or inverse kinematics. In particular, reaching and pointing motor tasks have been considered and tested on a human model. The kinematics has been described in terms of joint angles computed from a set of 3D trajectories of markers located on the subject and acquired by means of an opto-electronic motion analyzer. Our approach to movement simulation is based on a multi-layer perceptron able to predict the trajectory of an arm (three-stick model) in terms of joint angles by specifying only the starting and final positions (3D coordinates) of the end-effector. Results, in terms of residual training errors and extrapolation properties, show the reliability of the proposed method.
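The mapping described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a one-hidden-layer perceptron that takes the start and final 3D end-effector positions (6 inputs) and outputs a joint-angle trajectory for a three-joint arm sampled at T time steps. All sizes, the synthetic stand-in data, and the training loop are assumptions for illustration; the paper trained on joint angles derived from optically tracked marker data.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN = 6          # start (x,y,z) + final (x,y,z) of the end-effector
N_JOINTS = 3      # three-stick arm model
T = 20            # time samples per predicted trajectory
N_OUT = N_JOINTS * T
N_HID = 32

# Synthetic stand-in data (placeholder only): random start/end pairs and
# smooth "trajectories" built by linear interpolation in a fake joint space.
X = rng.uniform(-1, 1, size=(200, N_IN))
theta0, theta1 = X[:, :3], X[:, 3:]
alphas = np.linspace(0.0, 1.0, T)
# shape (200, N_JOINTS, T) flattened to (200, N_OUT)
Y = np.stack([(1 - a) * theta0 + a * theta1 for a in alphas], axis=2)
Y = Y.reshape(200, N_OUT)

# Multi-layer perceptron weights, trained by plain gradient descent on MSE.
W1 = rng.normal(0, 0.3, (N_IN, N_HID)); b1 = np.zeros(N_HID)
W2 = rng.normal(0, 0.3, (N_HID, N_OUT)); b2 = np.zeros(N_OUT)

def forward(x):
    h = np.tanh(x @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # linear output layer

lr, losses = 0.05, []
for _ in range(300):
    h, pred = forward(X)
    err = pred - Y
    losses.append(float((err ** 2).mean()))
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Query: one start/end pair yields a full joint-angle trajectory (T x N_JOINTS).
_, out = forward(X[:1])
trajectory = out.reshape(N_JOINTS, T).T
```

The key point mirrored here is that the network never sees a kinematic model: it only learns the statistical mapping from boundary positions to whole joint-angle time courses.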
