Abstract
The face conveys information about a person's age, sex, background, and identity; what they are feeling, thinking, or likely to do next. Facial expression regulates face-to-face interactions, indicates reciprocity and interpersonal attraction or repulsion, and enables intersubjectivity between members of different cultures. Facial expression indexes neurological and psychiatric functioning and reveals personality and socioemotional development. Not surprisingly, the face has been of keen interest to behavioral scientists. About 15 years ago, computer scientists became increasingly interested in the use of computer vision and graphics to automatically analyze and synthesize facial expression. This effort was made possible in part by the development in psychology of detailed coding systems for describing facial actions and their relation to basic emotions, that is, emotions that are interpreted similarly in diverse cultures. The most detailed of these systems, the Facial Action Coding System (FACS), informed the development of the MPEG-4 facial animation parameters for video transmission and enabled progress toward automated measurement and synthesis of facial actions for research in affective computing, social signal processing, and behavioral science. This article reports key advances in behavioral science that are becoming possible through these developments. Before these advances are reported, automated facial image analysis and synthesis (AFAS) is briefly described.