Abstract

This paper presents a novel approach to representing human whole-body motion together with finger motion, and to classifying human motions performed during tasks that require delicate finger movements, such as holding or grasping. Whole-body motions are recorded with an optical motion capture system that measures the positions of markers attached to a performer. In addition, the performer wears data gloves with strain gauges fixed at the finger joints to measure flexion and extension. Combining the whole-body motion with the finger motions yields a representation of integrated motion, which is encoded into a probabilistic model whose parameters are optimized so that the model is most likely to generate the training data. An observed integrated motion is then classified into the probabilistic model with the largest probability of generating that observation. Synchronous measurement of whole-body and finger motions produced a dataset of integrated human motions. We tested the proposed approach on this dataset, demonstrating that representing whole-body motion integrated with finger motion improves classification of human motions during object manipulation.
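The classification rule described above — train one probabilistic model per motion class, then assign an observation to the class whose model gives it the highest likelihood — can be sketched as follows. This is a deliberately simplified illustration, not the paper's actual model: it fits a diagonal-covariance Gaussian per class to a feature vector (whereas the paper's model is a richer probabilistic model over motion sequences), and the class names, feature dimensions, and data are hypothetical.

```python
import numpy as np

def fit_gaussian(X):
    """Fit a diagonal-covariance Gaussian to samples X of shape (n_samples, n_dims)."""
    mu = X.mean(axis=0)
    var = X.var(axis=0) + 1e-6  # small floor to avoid zero variance
    return mu, var

def log_likelihood(x, mu, var):
    """Log-density of x under the fitted diagonal Gaussian (summed over dims)."""
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

def classify(x, models):
    """Pick the class whose model most likely generated observation x."""
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

# Toy data: each "integrated motion" feature vector concatenates whole-body
# and finger features (here just 4 synthetic dimensions per class).
rng = np.random.default_rng(0)
grasp = rng.normal(loc=0.0, scale=1.0, size=(50, 4))
wave = rng.normal(loc=5.0, scale=1.0, size=(50, 4))
models = {"grasp": fit_gaussian(grasp), "wave": fit_gaussian(wave)}

print(classify(np.array([4.8, 5.1, 5.0, 4.9]), models))  # near the "wave" cluster
```

The same maximum-likelihood decision rule carries over when the per-class Gaussians are replaced by sequence models trained on the integrated whole-body-plus-finger motion data.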
