Abstract

Emotional expressions are an evolutionarily conserved means of social communication. It is well established that emotional stimuli capture attention more readily than neutral stimuli, a phenomenon often referred to as “emotional attention”. However, contradictory findings on the “threat superiority effect” and the “happy face advantage” raise questions about how emotional valence biases attentional capture. In addition, an enduring attentional bias toward threat is commonly observed in highly anxious individuals, but whether state or trait anxiety modulates this attentional bias toward threat remains unresolved. To understand how the valence of emotional faces affects attentional capture and how the type of anxiety modulates emotional attention, we preregistered and conducted an online visual search experiment on Amazon’s Mechanical Turk. Participants (n=154) searched for a unique emotional face (happy or angry) among three or seven distractor faces that were closely cropped and balanced for low-level image statistics. Consistent with the “happy face advantage”, we found an attentional bias for positive over negative valence in visual search, robust at both set sizes and across accuracy, response time, and inverse efficiency score. We also found that both state and trait anxiety scores from the State-Trait Anxiety Inventory (STAI) correlated positively with search accuracy for angry targets but negatively with search accuracy for happy targets. Given these opposing patterns of correlation, we further computed a valence index of emotional attention as the difference in search accuracy between angry and happy targets. This valence index correlated strongly with anxiety, indicating that more anxious participants showed a stronger attentional bias toward threat and a weaker bias toward positively valenced stimuli. Together, our findings reveal the role of emotional valence in attentional capture and unveil a distinct impact of anxiety on attention to positive and threatening stimuli.
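
For concreteness, the valence index described above can be sketched as a simple accuracy difference between target types (a restatement of the abstract’s verbal definition, with notation chosen here for illustration); the inverse efficiency score is shown in its conventional form of mean correct response time divided by proportion correct, since the abstract does not specify the exact variant used:

\[
  \text{Valence index} = \mathrm{Acc}_{\text{angry}} - \mathrm{Acc}_{\text{happy}},
  \qquad
  \mathrm{IES} = \frac{\overline{\mathrm{RT}}_{\text{correct}}}{\text{proportion correct}}
\]

Under this convention, a more positive valence index reflects relatively better search for angry than happy targets, i.e., a stronger threat-related bias.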
