Abstract

Faces are processed by a distributed neural system spanning visual and non-visual cortex [the “core” and the “extended” systems; J.V. Haxby, E.A. Hoffman, M.I. Gobbini, The distributed human neural system for face perception, Trends Cogn. Sci. 4 (2000) 223–233]. Yet the functions of the different brain regions within the face-processing system are far from clear. On the basis of a case study of a patient unable to recognize fearful faces, Adolphs et al. [R. Adolphs, F. Gosselin, T.W. Buchanan, D. Tranel, P. Schyns, A.R. Damasio, A mechanism for impaired fear recognition after amygdala damage, Nature 433 (2005) 68–72] suggested that the amygdala may play a role in orienting attention towards the eyes, i.e. towards the region of the face conveying the most information about fear.

In a functional magnetic resonance imaging (fMRI) study comparing patterns of activation during observation of whole faces and parts of faces displaying neutral expressions, we evaluated the neural systems for face processing when only partial information is provided, as well as those involved in processing two socially relevant facial areas (the eyes and the mouth). Twenty-four subjects performed a gender-decision task on pictures showing whole faces, upper faces (eyes and eyebrows), and lower faces (mouth).

Our results showed that the amygdala responded more strongly to whole faces than to parts of faces, indicating that the amygdala is involved in orienting attention toward the eyes and mouth. Processing parts of faces in isolation activated other regions within both the “core” and the “extended” systems, as well as structures outside this network, suggesting that these structures are involved in building up the representation of the whole face from its parts.
