Abstract

In automatic user profiling, a number of features related to the user of a system are extracted in an attempt to deduce key information that can be used for adapting the interface and content of computer applications. In this context, identity, emotion, age and head orientation can provide important cues that enable efficient customization of an application's content. To develop automated user profiling applications, facial features are often used to provide the required information about the user. A key issue that arises is the applicability of different facial features to different user profiling tasks. In this paper we present a generalized framework for quantifying the invariance of different facial features across classification tasks, thereby assisting the implementation of efficient adaptive user profiling in computer applications. Preliminary experimental results demonstrate the potential of the proposed method in selecting the most useful features for different tasks.
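The abstract does not specify how invariance is quantified, so the following is only a minimal sketch of the general idea under an assumed measure: score each extracted facial feature by how strongly it varies with the labels of a given task (a Fisher-style between/within-class variance ratio is used here purely as a placeholder), so that features with low scores are treated as invariant to that task and features with high scores as candidates for its classifier. All names, labels and data below are hypothetical.

```python
# Sketch only (not the authors' method): per-feature relevance scores for a
# given profiling task, using an assumed between/within-class variance ratio.
import numpy as np

def task_relevance_scores(features: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Return one between/within-class variance ratio per feature column.

    features : (n_samples, n_features) matrix of extracted facial features
    labels   : (n_samples,) task labels (e.g. identity, emotion or age group)
    """
    classes = np.unique(labels)
    overall_mean = features.mean(axis=0)
    between = np.zeros(features.shape[1])
    within = np.zeros(features.shape[1])
    for c in classes:
        cls = features[labels == c]
        between += len(cls) * (cls.mean(axis=0) - overall_mean) ** 2
        within += ((cls - cls.mean(axis=0)) ** 2).sum(axis=0)
    # High ratio: the feature discriminates this task; low ratio: it is
    # effectively invariant to it and better suited to other tasks.
    return between / (within + 1e-12)

# Hypothetical usage: rank the same feature set separately for two tasks.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))             # 8 extracted facial features
identity = rng.integers(0, 5, size=200)   # placeholder identity labels
emotion = rng.integers(0, 3, size=200)    # placeholder emotion labels
print("identity-relevant features:", np.argsort(task_relevance_scores(X, identity))[::-1])
print("emotion-relevant features: ", np.argsort(task_relevance_scores(X, emotion))[::-1])
```

Ranking the same feature set against each task's labels in this way yields a per-task ordering of features, which is the kind of task-specific selection the abstract describes.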
