Abstract

Visual selective attention acts as a filter on perceptual information, facilitating learning and inference about important events in an agent’s environment. A role for visual attention in reward-based decisions has previously been demonstrated, but it remains unclear how visual attention is recruited during aversive learning, particularly when learning about multiple stimuli concurrently. This question is of particular importance in psychopathology, where enhanced attention to threat is a putative feature of pathological anxiety. Using an aversive reversal learning task that required subjects to learn, and exploit, predictions about multiple stimuli, we show that the allocation of visual attention is influenced significantly by aversive value but not by uncertainty. Moreover, this relationship is bidirectional in that attention biases value updates for attended stimuli, resulting in heightened value estimates. Our findings have implications for understanding biased attention in psychopathology and support a role for learning in the expression of threat-related attentional biases in anxiety.

Highlights

  • To enable efficient learning and inference about the environment, perceptual inputs need to be prioritised appropriately

  • We investigated how value and uncertainty influence visual attention during aversive learning

  • The findings have implications for understanding the development of pathological threat-related attentional biases that are a feature of anxiety disorders

Introduction

To enable efficient learning and inference about the environment, perceptual inputs need to be prioritised appropriately. Attention acts as a perceptual filter in sensory processing [1,2], learning [3,4,5], and inference [6,7], with various statistical [8,9] and computational [10,11] models proposed to account for its role.
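
The bidirectional relationship described in the abstract (aversive value attracting attention, and attention in turn weighting value updates) can be sketched with a simple attention-modulated Rescorla-Wagner learner. The sketch below is illustrative only and is not the model fitted in this study; the shock probabilities, base learning rate alpha, and attention temperature beta are all hypothetical choices.

    import numpy as np

    rng = np.random.default_rng(0)

    n_trials, n_stimuli = 200, 2
    p_shock = np.array([0.8, 0.2])   # hypothetical shock probability per stimulus
    V = np.zeros(n_stimuli)          # learned aversive value estimates
    alpha, beta = 0.3, 4.0           # base learning rate; attention inverse temperature

    for t in range(n_trials):
        # Value-driven attention: a softmax over current value estimates,
        # so the more threatening stimulus attracts more attention.
        z = beta * (V - V.max())
        attn = np.exp(z) / np.exp(z).sum()

        # Each stimulus is independently paired with shock (1) or no shock (0).
        shock = (rng.random(n_stimuli) < p_shock).astype(float)

        # Attention-biased updating: attended stimuli receive larger
        # prediction-error updates, so their values are learned faster.
        V += alpha * attn * (shock - V)

In this toy loop the high-threat stimulus both captures most of the attention and has its value learned about more quickly, reproducing in miniature the value-attention loop the study reports.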
