Abstract

The study of emotional body language has occupied scientists from psychology, neuroscience, biology, and other fields for more than 200 years. Much of this work has focused on the analysis of kinematics, while the underlying dynamics remain largely unexplored. In this thesis we model human walking as a nonlinear multi-phase optimal control problem to investigate the dynamics of full-body emotional expressions in human locomotion. Our approach combines rigid multibody dynamics, a highly parameterized mathematical model of the human locomotion system, and the direct multiple-shooting method to analyze the dynamics of recorded kinematic motion capture data. Modeling the dynamics of a human rigid multibody model results in a set of highly complex differential-algebraic equations that require automated methods to derive and evaluate. We created a new rigid multibody dynamics software package to model and numerically evaluate kinematic and dynamic quantities of rigid multibody systems expressed in generalized coordinates, including external contacts and the discontinuities arising from contact events. The package evaluates the components of the equations of motion using recursive algorithms based on Featherstone's 6-D spatial algebra notation. It is specifically tailored for use in numerical optimal control and carefully designed to exploit sparsity and to reduce redundant computation by selectively reusing computed values. This allows it to match, and in parts exceed, the performance otherwise only attainable with source-code-generation modeling approaches. We also created a highly parameterized 3-D meta model of the human locomotion system. This rigid multibody model is based on biomechanical data for kinematic and inertial parameters and enables us to create subject-specific dynamic models by adjusting segment dimensions, joint locations, and inertial parameters.
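The thesis package computes such quantities with Featherstone's recursive spatial-algebra algorithms; those are beyond the scope of a short example. As a minimal sketch of the underlying idea, computing joint torques in generalized coordinates, the toy function below does Newton-Euler-style inverse dynamics for a planar chain of point masses. The simplified point-mass model and all names here are our own illustration, not the package's API.

```python
import numpy as np

def planar_inverse_dynamics(q, qd, qdd, m, l, g=9.81):
    """Joint torques tau for a planar chain of revolute joints.

    q, qd, qdd : relative joint angles and derivatives (rad), measured
                 from the straight-down configuration.
    m, l       : mass and length per link (point mass at the link tip).
    """
    n = len(q)
    th, thd, thdd = np.cumsum(q), np.cumsum(qd), np.cumsum(qdd)  # absolute angles

    # Tip position and acceleration of each link (2-D vectors).
    p = np.zeros((n, 2)); a = np.zeros((n, 2))
    pos = np.zeros(2); acc = np.zeros(2)
    for k in range(n):
        u = np.array([np.sin(th[k]), -np.cos(th[k])])   # unit vector along link
        w = np.array([np.cos(th[k]),  np.sin(th[k])])   # perpendicular direction
        pos = pos + l[k] * u
        acc = acc + l[k] * (thdd[k] * w - thd[k] ** 2 * u)
        p[k], a[k] = pos, acc

    # tau_i = sum over distal masses of r x m (a - g), 2-D cross product.
    gvec = np.array([0.0, -g])
    joint = np.vstack([np.zeros(2), p[:-1]])            # joint i sits at tip i-1
    tau = np.zeros(n)
    for i in range(n):
        for j in range(i, n):
            r = p[j] - joint[i]
            f = m[j] * (a[j] - gvec)
            tau[i] += r[0] * f[1] - r[1] * f[0]
    return tau
```

For a single link this reduces to the textbook pendulum torque m l^2 qdd + m g l sin(q), which makes the sketch easy to sanity-check.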
To describe the contact between the human model and the ground, we created a non-holonomic rigid-body contact model specifically for human walking that approximates the foot geometry with a sphere at the heel and a line segment at the ball of the foot during forefoot contact. Transforming motion capture marker data into rigid multibody motion is a difficult problem due to unknown joint centers, redundant marker movements, and non-rigid marker movement caused by skin and soft-tissue motion. In this thesis, we developed and implemented a semi-automatic method in which we manually adjust the model to approximate the recorded subject and then compute joint angles by solving a nonlinear least-squares optimization problem. Our approach is independent of the motion capture marker set used and maps directly onto the joint space of the model. We formulate two types of multi-phase optimal control problems for human walking: an inverse reconstruction problem and a gait synthesis problem, both constrained by the differential equations of the rigid multibody dynamics and usable for different purposes. The reconstruction problem computes the unknown joint actuations from purely kinematic motion capture data. Applied to the recorded motion capture data, the reconstructed joint actuations show emotion-specific features that are also found in the recorded muscle activity. This validates our model and supports the use of optimal control problems as a tool to study emotional body language in a new way. Our gait synthesis formulation allows the generation of walking motions based solely on mathematical and physical principles; it can be applied in computer animation, robotics, and predictive gait analysis. We have generated a wide range of motions by adjusting the objective function and gait parameters. A long-term goal of this formulation is to investigate optimality criteria of emotional walking motions.
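The marker-to-joint-angle fit can be illustrated on a toy model. The sketch below, our own simplified example rather than the thesis implementation, fits the two joint angles of a planar two-link chain to recorded marker positions with `scipy.optimize.least_squares`; the link lengths and marker placement are assumptions made for the illustration.

```python
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.4, 0.35  # assumed link lengths of a toy planar two-link chain

def virtual_markers(q):
    """Model marker positions (mid joint and tip) for joint angles q."""
    q1, q2 = q
    mid = np.array([L1 * np.cos(q1), L1 * np.sin(q1)])
    tip = mid + np.array([L2 * np.cos(q1 + q2), L2 * np.sin(q1 + q2)])
    return np.concatenate([mid, tip])

def fit_joint_angles(recorded, q0=np.zeros(2)):
    """Least-squares fit of joint angles to recorded marker positions."""
    res = least_squares(lambda q: virtual_markers(q) - recorded, q0)
    return res.x

# Synthetic "motion capture" frame generated from known angles.
q_true = np.array([0.3, 0.7])
markers = virtual_markers(q_true)
q_fit = fit_joint_angles(markers)
```

In this clean synthetic setting the fit recovers the generating angles exactly; with real marker data the residual stays nonzero because of skin and soft-tissue motion, which is what the least-squares formulation absorbs.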
To investigate such optimality criteria, we aim to use hierarchical optimal control problems in future work.
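At the lower level of such a hierarchy sits an optimal control problem with the dynamics as constraints. As a hedged illustration of that structure, a toy double integrator rather than the thesis's walking model or its multiple-shooting discretization, the sketch below minimizes control effort subject to discretized dynamics and boundary conditions via direct single shooting:

```python
import numpy as np
from scipy.optimize import minimize

N, T = 20, 1.0          # discretization steps and horizon
dt = T / N

def simulate(u):
    """Forward-Euler rollout of the double integrator x'' = u."""
    p, v = 0.0, 0.0
    for uk in u:
        p, v = p + dt * v, v + dt * uk
    return p, v

def boundary(u):
    """Terminal constraints: reach position 1 at rest."""
    p, v = simulate(u)
    return [p - 1.0, v]

# Minimize the control effort dt * sum(u^2) subject to the terminal constraints.
res = minimize(lambda u: dt * np.sum(np.asarray(u) ** 2),
               x0=np.zeros(N),
               constraints={"type": "eq", "fun": boundary},
               method="SLSQP")
u_opt = res.x
```

A gait synthesis problem has the same shape: the rollout becomes the multibody dynamics with contact phases, the boundary conditions encode periodicity of the gait, and the objective encodes the optimality criterion under study.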
