Abstract
The past few years have seen a marked increase in our understanding of how faces are represented in the brain, with the discovery of new anatomical structures and new algorithms for representing faces. Still, the basic computational mechanism used by the primate visual system to identify faces remains a topic of intense debate. Are faces represented by matching to a set of stored exemplars, or by measuring the distance from a standard prototype along a set of different axes? A recent article by Rhodes and Jeffery provides compelling psychophysical evidence in favor of the latter 'axis' model.