Abstract

Human motion tracking is a powerful tool used in a large range of applications that require human movement analysis. Although it is a well-established technique, its main limitation is that it provides no real-time estimate of kinetic quantities such as forces and torques during motion capture. In this paper, we present a novel approach to soft wearable force tracking for the simultaneous estimation of whole-body forces along with the motion. The early stage of our framework combines traditional passive marker-based methods with inertial and contact force sensor modalities, and harnesses a probabilistic computational technique for estimating dynamic quantities, originally proposed in the domain of humanoid robot control. We present an experimental analysis of subjects performing a two degrees-of-freedom bowing task, from which we estimate both motion and kinetic quantities. The results demonstrate the validity of the proposed method. We discuss the possible use of this technique in the design of a novel soft wearable force tracking device and its potential applications.

Highlights

  • Human whole-body motion tracking is now a well-established tool in the analysis of human movements

  • Combinations of different technologies have been used for detecting human motion: in [6], a video-based motion-capture technique was adopted for capturing realistic human motion from video sequences

  • Inspired by a recent research study on sensor fusion for whole-body estimation on the humanoid iCub robot [10] (Istituto Italiano di Tecnologia, Genova, Italy), in this paper, we propose a novel framework for wearable dynamics (WearDY) aiming to bridge the gap in dynamics analysis by fusing motion and force capture


Introduction

Human whole-body motion tracking is now a well-established tool in the analysis of human movements. Including and exploiting all available dynamic information is crucial in several research areas, such as ergonomics for industrial scenarios, the development of prosthetic devices and exoskeleton systems in rehabilitation, and human-robot interaction. For these reasons, whole-body force tracking is not a new challenge for the scientific community, but the topic has seldom been explored in situ, owing to the computational difficulty of the analysis, and even more rarely in real time. Inspired by a recent research study on sensor fusion for whole-body estimation on the humanoid iCub robot [10] (Istituto Italiano di Tecnologia, Genova, Italy), in this paper we propose a novel framework for wearable dynamics (WearDY) that aims to bridge the gap in dynamics analysis by fusing motion and force capture.
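The probabilistic estimation scheme referenced above can be illustrated, in broad strokes, as a linear-Gaussian maximum-a-posteriori (MAP) problem: the dynamic variables `d` must satisfy the rigid-body dynamics constraints (here abstracted as `D d + b ≈ 0`) while remaining consistent with redundant sensor measurements `y ≈ Y d`, each weighted by its covariance. The sketch below is an illustrative toy implementation of generic linear-Gaussian MAP fusion, not the authors' actual algorithm or code; all matrix names (`D`, `b`, `Y`, `Sigma_*`, `mu_d`) are assumptions introduced for the example.

```python
import numpy as np

def map_estimate(D, b, Y, y, Sigma_D, Sigma_y, mu_d, Sigma_d):
    """Linear-Gaussian MAP fusion (illustrative sketch).

    Minimises the covariance-weighted sum of:
      - the dynamics-constraint residual  D d + b,
      - the measurement residual          Y d - y,
      - the deviation from the prior mean mu_d.
    The minimiser solves a single linear system A d = rhs.
    """
    iSD = np.linalg.inv(Sigma_D)   # confidence in the dynamics model
    iSy = np.linalg.inv(Sigma_y)   # confidence in the sensors
    iSd = np.linalg.inv(Sigma_d)   # confidence in the prior
    A = D.T @ iSD @ D + Y.T @ iSy @ Y + iSd
    rhs = Y.T @ iSy @ y - D.T @ iSD @ b + iSd @ mu_d
    return np.linalg.solve(A, rhs)

# Toy usage: two unknowns observed directly with low-noise sensors,
# a trivial (empty) dynamics constraint and a vague prior.
D = np.zeros((1, 2)); b = np.zeros(1); Sigma_D = np.eye(1)
Y = np.eye(2); y = np.array([1.0, 2.0]); Sigma_y = 1e-3 * np.eye(2)
mu_d = np.zeros(2); Sigma_d = 1e2 * np.eye(2)
d_hat = map_estimate(D, b, Y, y, Sigma_D, Sigma_y, mu_d, Sigma_d)
```

With tight measurement covariance the estimate tracks the measurements; inflating `Sigma_y` instead pulls the solution toward the prior and the dynamics constraints, which is the mechanism by which redundant, heterogeneous sensors can be fused in a single step.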

Spatial Algebra Description
Stochastic Notation
Problem Statement and Formulation
Recursive Newton–Euler Algorithm
RNEA Matrix Formulation and the Measurements Equation
Considerations on the Representation
Over Constrained RNEA
Experimental Set-up
Human Body Modeling
Proof of Concept
Method Robustness Test
Conclusions
References

