Abstract

A real-time computer vision system for tracking hands is described, enabling behavioural events to be interpreted. Forearms are tracked to provide structural context, so that mutual occlusion, which occurs when hands cross one another, can be handled robustly. No prior skin colour models are used; instead, adaptive appearance models are learned on-line. A contour distance transform is used both to control model adaptation and to fit 2D geometric models robustly. Hands can be tracked whether clothed or unclothed. Results are given for a ‘smart desk’ and an in-vehicle application, and the ability to interpret behavioural events of interest when tracking a vehicle driver's hands is described.
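The abstract names two mechanisms, a contour distance transform and on-line appearance adaptation, whose interaction can be illustrated with a short sketch. This is not the authors' implementation: it assumes OpenCV, a binary foreground mask per frame, and a hypothetical hue-histogram appearance model updated with exponential forgetting; the `margin` and `rate` parameters are purely illustrative.

```python
import cv2
import numpy as np

def contour_distance(mask: np.ndarray) -> np.ndarray:
    """Distance from each pixel to the nearest contour pixel of `mask`."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    edges = np.zeros_like(mask)
    cv2.drawContours(edges, contours, -1, 255, 1)
    # distanceTransform measures distance to the nearest zero pixel,
    # so the contour must be zero and everything else non-zero.
    return cv2.distanceTransform(cv2.bitwise_not(edges), cv2.DIST_L2, 5)

def adapt_appearance(model: np.ndarray, frame_hsv: np.ndarray,
                     mask: np.ndarray, margin: float = 5.0,
                     rate: float = 0.05) -> np.ndarray:
    """Update a hue histogram only from pixels safely inside the contour."""
    dist = contour_distance(mask)
    # Gate adaptation: ignore pixels within `margin` pixels of the contour,
    # where segmentation errors and occlusion boundaries are most likely.
    interior = (mask > 0) & (dist > margin)
    hue = frame_hsv[..., 0][interior]
    hist, _ = np.histogram(hue, bins=model.shape[0], range=(0, 180))
    hist = hist.astype(np.float64)
    if hist.sum() > 0:
        hist /= hist.sum()
        model = (1.0 - rate) * model + rate * hist  # exponential forgetting
    return model
```

Gating the update by contour distance keeps boundary pixels, where segmentation errors concentrate, out of the appearance model, which is one plausible reading of how a distance transform could "control model adaptation".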
