Abstract

We present a new motion tracking technique to robustly reconstruct non-rigid geometries and motions from single-view depth input recorded by a consumer depth sensor. The idea is based on the observation that most non-rigid motions (especially human-related motions) intrinsically lie in an articulated motion subspace. To exploit this property, we propose a novel L0-based motion regularizer with an iterative solver that implicitly constrains local deformations to articulated structures, leading to a reduced solution space and physically plausible deformations. The strategy is integrated into an existing non-rigid motion tracking pipeline and gradually extracts articulated joint information online during tracking, which corrects tracking errors in the results. This joint information is then used in subsequent tracking to further improve accuracy and prevent tracking failures. Extensive experiments on complex human body motions with occlusions, as well as facial and hand motions, demonstrate that our approach substantially improves the robustness and accuracy of motion tracking.
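
To illustrate the core idea of an L0-based motion regularizer solved iteratively, the following is a minimal sketch of one common strategy for L0-regularized problems: auxiliary-variable splitting with hard thresholding. It uses a simplified 1-D analogue in which each node has a scalar motion value and the L0 penalty acts on differences between neighboring nodes, so non-zero differences concentrate at joint-like locations. All names, parameters, and the 1-D setup are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical 1-D sketch of an L0-regularized motion solve:
#   minimize  ||x - d||^2 + lam * ||G x||_0
# where d holds per-node data-term targets and G x stacks differences of
# neighboring node motions. The L0 term encourages most neighboring nodes
# to move identically, so non-zero differences concentrate at "joints".
import numpy as np

def l0_motion_solve(d, lam=0.02, beta0=0.01, beta_max=1e3, kappa=2.0):
    """Alternate a quadratic solve for x with hard thresholding of an
    auxiliary difference variable k ~= G x, increasing beta each round."""
    n = len(d)
    # Forward-difference operator G (shape (n-1, n)) between neighbors.
    G = np.zeros((n - 1, n))
    for i in range(n - 1):
        G[i, i], G[i, i + 1] = -1.0, 1.0
    x = d.copy()
    beta = beta0
    while beta < beta_max:
        # k-subproblem: keep a difference only if its squared magnitude
        # outweighs lam / beta; otherwise snap it to zero (hard threshold).
        g = G @ x
        k = np.where(g**2 > lam / beta, g, 0.0)
        # x-subproblem: quadratic with closed-form solution,
        # (I + beta * G^T G) x = d + beta * G^T k
        A = np.eye(n) + beta * (G.T @ G)
        x = np.linalg.solve(A, d + beta * (G.T @ k))
        beta *= kappa
    return x

# Toy usage: two rigid segments with a single "joint" between nodes 4 and 5.
d = np.concatenate([np.full(5, 0.0), np.full(5, 1.0)]) + 0.05 * np.random.randn(10)
x = l0_motion_solve(d)
print(np.round(x, 2))  # differences concentrate at the joint; segments stay rigid
```

In this toy setup the recovered motion field is piecewise constant, mirroring how an articulated prior concentrates deformation at a small set of joints while keeping the rest of the surface near-rigid.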
