Abstract

When faces appear in our visual environment we naturally attend to them, possibly to the detriment of other visual information. Evidence from behavioural studies suggests that faces capture attention because they are more salient than other types of visual stimuli, reflecting a category-dependent modulation of attention. By contrast, neuroimaging data have led to a domain-specific account of face perception that rules out the direct contribution of attention, suggesting a dedicated neural network for face perception. Here we sought to dissociate effects of attention from categorical perception using Event-Related Potentials (ERPs). Participants viewed physically matched face and butterfly images, with each category acting as the target stimulus in different blocks of an oddball paradigm. Using a data-driven approach based on functional microstates, we show that the locus of endogenous attention effects in the ERPs occurs in the N1 time range. Earlier categorical effects were also found at the level of the P1, reflecting either an exogenous increase in attention towards face stimuli or a putative face-selective response. Category and attention effects were dissociable from one another, supporting the conclusion that certain object categories, in this experiment faces, may capture attention before top-down voluntary control of attention is initiated.

Highlights

  • We are able to recognise objects with only a momentary glance around our visual environment, and some of these objects capture our attention more than others

  • There was an interaction of deviant by category [F(1,18) = 6.507, p < .05, ηp² = 0.255] such that reaction times for faces were slightly quicker than for butterflies when they were targets, but slower when they were non-targets

  • We aimed to test whether Event-Related Potential (ERP) correlates of face perception were impervious to effects of attention, which would support a domain-specific view of face perception, or whether attention was a driving factor

Introduction

We are able to recognise objects with only a momentary glance around our visual environment, and some of these objects will capture our attention more than others. This capture of attention is guided by both a bottom-up structural analysis of images and a top-down control of attention, suggesting that, in a particular context, one stimulus can become most salient; for example, noticing a fire alarm in a corridor only in the event of your office burning down. In visual search paradigms, faces appear to capture attention even when they are not the explicit target [2]. These behavioural findings appear to contrast with some evidence from neuroimaging suggesting that effects of attention do not modulate early domain-specific processes in the perception of faces [3, 4]. We use Event-Related Potentials (ERPs) to identify the locus of attention within face perception, asking whether face-specific processes operate entirely independently of attention.
