Abstract

We present an efficient face tracking method that is robust, especially when the face turns away from the frontal view toward the side view (out-of-plane rotation). The proposed method, consisting of three steps, uses the horizontal and vertical projection histograms of a face region to model the visual appearance of the face; the vertical and horizontal positions of the face are determined sequentially. First, the horizontal projection histogram of each candidate face region in the current frame, taken near the corresponding face region in the previous frame, is fed to a back-propagation neural network (BPNN) to reliably estimate the vertical position of the face, based on the observation that the distribution of the horizontal projection histogram of a face region remains stable even when the head rotates about an axis parallel to the vertical axis of the image plane. Second, the vertical projection histogram of an eye region, derived from the estimated vertical position of the face, together with the output values of the BPNN, is used to estimate the horizontal position of the face. Third, the detected face region is refined by head boundary detection. These three steps are applied repeatedly to track faces in each shot of an input video. Experimental results are provided to demonstrate the efficiency of the proposed method.
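The sketch below illustrates the projection-histogram idea described above: row sums of a grayscale face patch form the horizontal projection histogram, column sums form the vertical one, and the horizontal histogram is scored by a small network. The histogram normalization, the network size, and the class name `TinyBPNN` are assumptions for exposition, not the authors' implementation.

```python
# Illustrative sketch only; the normalization and the tiny network are
# assumptions, not the method's actual parameters.
import numpy as np

def horizontal_projection(region: np.ndarray) -> np.ndarray:
    """Sum pixel intensities along each row of a grayscale patch,
    normalized to [0, 1]; eye and mouth rows give distinctive dips."""
    hist = region.sum(axis=1).astype(float)
    return hist / (hist.max() + 1e-8)

def vertical_projection(region: np.ndarray) -> np.ndarray:
    """Sum pixel intensities along each column, normalized to [0, 1]."""
    hist = region.sum(axis=0).astype(float)
    return hist / (hist.max() + 1e-8)

class TinyBPNN:
    """Hypothetical stand-in for the paper's back-propagation network:
    one hidden layer mapping a fixed-length horizontal projection
    histogram to a score for a candidate vertical face position."""
    def __init__(self, n_in: int, n_hidden: int = 16, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, 1))

    def forward(self, x: np.ndarray) -> float:
        h = np.tanh(x @ self.W1)          # hidden layer activation
        return float(1.0 / (1.0 + np.exp(-(h @ self.W2))))  # sigmoid score

# Example: score one candidate face region for its vertical position.
face = np.random.randint(0, 256, (64, 64))          # stand-in grayscale patch
net = TinyBPNN(n_in=64)
score = net.forward(horizontal_projection(face))    # higher = more face-like row profile
```

In the same spirit, the second step would apply `vertical_projection` to the eye band selected by the estimated vertical position to localize the face horizontally, before the boundary-based refinement of the third step.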
