Abstract

Non-permanent facial makeup remains one of the most difficult problems inhibiting face recognition systems in security applications. In this paper, a new method is proposed for makeup-invariant face identification and verification. Face images from the Virtual Makeup (VMU) and YouTube Makeup (YMU) datasets were processed with Gabor filtering and the histogram of oriented gradients (HOG) for feature extraction. The Gabor and HOG features were concatenated to form the final feature vectors, whose dimensionality was then reduced in a Fisher linear discriminant analysis (FLDA) subspace. The reduced features were classified using the city block distance (CBD), Euclidean distance (EUC), cosine similarity measure (CSM), and whitened cosine similarity measure (WCSM). The CSM achieved the best recognition rates of the four metrics, with identification and verification rates of 100% and 100% on the VMU database and 72.52% and 79.47% on the YMU database, respectively. The proposed method outperformed several state-of-the-art methods evaluated for comparison.
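The final classification stage described above reduces to nearest-neighbour matching of feature vectors under a chosen metric. The following is a minimal sketch of that stage only, assuming the cosine similarity measure (CSM) that performed best in the paper; the feature vectors here are synthetic stand-ins for the concatenated Gabor+HOG descriptors, and all function names are illustrative, not from the paper's implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    # CSM: cos(theta) between two feature vectors; higher means more similar.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify_csm(probe, gallery, labels):
    # Nearest-neighbour match: assign the probe the label of the gallery
    # vector with the largest cosine similarity.
    sims = [cosine_similarity(probe, g) for g in gallery]
    return labels[int(np.argmax(sims))]

# Toy gallery of three enrolled subjects. In the paper these would be
# FLDA-reduced concatenations of Gabor and HOG features per face image.
rng = np.random.default_rng(0)
gallery = [rng.normal(size=8) for _ in range(3)]
labels = ["subject_a", "subject_b", "subject_c"]

# A probe simulating the same subject under makeup: subject_b's vector
# with a small perturbation.
probe = gallery[1] + 0.05 * rng.normal(size=8)
predicted = classify_csm(probe, gallery, labels)
print(predicted)
```

The city block and Euclidean variants differ only in the scoring function (minimising `np.sum(np.abs(a - b))` or `np.linalg.norm(a - b)` instead of maximising cosine similarity).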
