Abstract

This paper introduces a real-time model-based human motion tracking and analysis method for human-computer interfaces (HCI). The method tracks and analyzes human motion from two orthogonal views without using any markers. The motion parameters are estimated by pattern matching between the extracted human silhouette and the human model. First, the human silhouette is extracted and the body definition parameters (BDPs) are obtained. Second, the body animation parameters (BAPs) are estimated by an overlapped tritree hierarchical search algorithm. To verify the performance of the method, we demonstrate different human posture sequences and use a hidden Markov model (HMM) for posture recognition testing.
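The paper does not spell out its similarity measure in this summary; as a minimal sketch, one common choice for comparing an extracted silhouette against a rendered model silhouette is pixel-overlap (intersection-over-union) between the two binary masks. The function name and the toy masks below are illustrative assumptions.

```python
import numpy as np

def silhouette_similarity(extracted: np.ndarray, model: np.ndarray) -> float:
    """Overlap (intersection-over-union) between two binary silhouette masks.

    `extracted` is the foreground mask segmented from a camera view;
    `model` is the human-model silhouette rendered under candidate motion
    parameters.  Both are boolean arrays of the same shape.
    """
    inter = np.logical_and(extracted, model).sum()
    union = np.logical_or(extracted, model).sum()
    return float(inter) / union if union else 1.0

# Toy 2x2 silhouettes: the model mask matches 3 of the 4 foreground pixels.
a = np.array([[1, 1], [1, 1]], dtype=bool)
b = np.array([[1, 1], [1, 0]], dtype=bool)
print(silhouette_similarity(a, b))  # → 0.75
```

A score of 1.0 means the rendered model exactly covers the extracted foreground, so maximizing this score over candidate motion parameters drives the pattern matching.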

Highlights

  • Human motion tracking and analysis has many applications, such as surveillance systems and human-computer interface (HCI) systems

  • The representation of the human body has evolved from stick figures [1, 2] to 2D contours [3, 4] and 3D volumes [5, 6], with increasing model complexity

  • The body animation parameter (BAP) estimation algorithm may fail if the extracted foreground object is noisy or ambiguous, for example when the limbs occlude the torso
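The overlapped tritree search named in the abstract can be pictured as a coarse-to-fine search over one joint angle: split the range into three overlapping sub-intervals, keep the one whose midpoint scores best under silhouette matching, and recurse. The sketch below is an assumption about the scheme, not the paper's exact algorithm; the quadratic score function is a toy stand-in for silhouette matching.

```python
def tritree_search(score, lo, hi, depth=6, overlap=0.25):
    """Coarse-to-fine search: split [lo, hi] into three overlapping
    sub-intervals, keep the one whose midpoint scores best, recurse.

    `score` maps a candidate angle to a similarity value (higher is
    better); in the paper this would be silhouette-vs-model matching.
    The overlap between neighbouring sub-intervals keeps a maximum
    near a split boundary from being missed.
    """
    for _ in range(depth):
        width = (hi - lo) / 3.0
        pad = width * overlap          # extension shared with neighbours
        thirds = [(lo + i * width - pad, lo + (i + 1) * width + pad)
                  for i in range(3)]
        # Keep the sub-interval whose midpoint scores best.
        lo, hi = max(thirds, key=lambda ab: score((ab[0] + ab[1]) / 2.0))
    return (lo + hi) / 2.0

# Toy score peaked at 40 degrees (stand-in for silhouette matching).
angle = tritree_search(lambda a: -(a - 40.0) ** 2, 0.0, 90.0)
print(angle)  # converges to roughly 40 degrees
```

Each level shrinks the interval by about a factor of three, so a few levels suffice for sub-degree precision while evaluating only three candidates per level.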


Summary

INTRODUCTION

Human motion tracking and analysis has many applications, such as surveillance systems and human-computer interface (HCI) systems. The abstraction levels for comparing image data with synthesized data can be edges, silhouettes, contours, sticks, joints, blobs, texture, motion, and so forth. Another HCI system, called "video avatar" [9], allows a real human actor to be transferred to another site and integrated with a virtual world. In [19], an interesting approach for detecting and tracking human motion was proposed, which computes a best global labeling of point features using a learned triangular decomposition of the human body. Another real-time human posture estimation system [20] uses trinocular images and simple 2D operations to find the significant points of the human silhouette and reconstructs the 3D positions of the human object from the corresponding significant points. In our method, each view estimates the BAPs individually, and the per-view estimates are combined into the final universal BAPs.
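The introduction ends with per-view BAPs being combined into universal BAPs, but this summary does not give the integration rule. A plausible sketch is a confidence-weighted average, where each view's weight is its silhouette-matching score; the function name, the weighting rule, and the joint names below are illustrative assumptions, not necessarily the paper's scheme.

```python
def combine_baps(view_estimates):
    """Fuse per-view BAP estimates into one universal set.

    `view_estimates` maps a view name to (angles, confidence), where
    `angles` is a dict of joint-angle parameters (degrees) and
    `confidence` is that view's silhouette-matching score.  Each joint
    angle is averaged across views, weighted by view confidence.
    """
    fused = {}
    total = sum(conf for _, conf in view_estimates.values())
    for angles, conf in view_estimates.values():
        for joint, value in angles.items():
            fused[joint] = fused.get(joint, 0.0) + value * conf / total
    return fused

# Front view sees the elbow clearly; the side view sees it poorly,
# e.g. because the arm occludes the torso in that view.
estimates = {
    "front": ({"l_elbow": 30.0}, 0.9),
    "side":  ({"l_elbow": 50.0}, 0.1),
}
print(combine_baps(estimates))  # elbow angle dominated by the front view
```

Weighting by matching confidence lets a view where a limb is occluded contribute less to the fused estimate, which matches the failure mode noted in the highlights.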

HUMAN MODEL GENERATION
Homogeneous coordinate transformation
Similarity measurement
BDPs determination
MOTION PARAMETERS ESTIMATION
Object tracking
Arm joint angle estimation
Overlapped tritree hierarchical search algorithm
Camera calibration
Perspective scaling factor determination
Body animation parameter integration
EXPERIMENTAL RESULTS
Training phase
Recognition phase
CONCLUSION AND FUTURE WORKS