Object size determines the spatial spread of visual time

Abstract

A key question for temporal processing research is how the nervous system extracts event duration, despite a notable lack of neural structures dedicated to duration encoding. This is in stark contrast with the orderly arrangement of neurons tasked with spatial processing. In this study, we examine the linkage between the spatial and temporal domains. We use sensory adaptation techniques to generate after-effects in which perceived duration is compressed or expanded away from the duration of the adapting stimulus. Our results indicate that these after-effects are broadly tuned, extending over an area approximately five times the size of the stimulus. This region scales directly with the size of the adapting stimulus: the larger the adapting stimulus, the greater the spatial spread of the after-effect. We construct a simple model to test predictions based on overlapping adapted versus non-adapted neuronal populations and show that our effects cannot be explained by any single, fixed-scale stage of neural filtering. Rather, they are best explained by a self-scaled mechanism underpinned by duration-selective neurons that also pool spatial information across earlier stages of visual processing.
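
To make the population-overlap logic concrete, here is a minimal sketch of this style of model. It assumes 1-D Gaussian receptive fields and treats the predicted after-effect at a test location as the normalized overlap between the adapted and test-driven populations; the function names, the half-max criterion, and all parameter values are illustrative assumptions, not the paper's published implementation.

```python
# Minimal sketch of an overlapping-population model of the duration
# after-effect (DAE). Assumptions (not from the paper): 1-D Gaussian
# receptive fields; DAE magnitude proportional to the normalized overlap
# between the adapted population and the population driven by the test.
import numpy as np

def population_response(center, sigma, x):
    """Gaussian profile of population activity over retinotopic positions x."""
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def dae_magnitude(offset, adapter_size, scale_with_size=True):
    """Predicted DAE at a test location `offset` degrees from the adapter.

    Under the self-scaled account the population width tracks stimulus size;
    under a fixed-scale account it is constant (1 deg here, an assumption).
    """
    sigma = adapter_size if scale_with_size else 1.0
    x = np.linspace(-30.0, 30.0, 2001)              # visual field (deg)
    adapted = population_response(0.0, sigma, x)    # adapter at fixation
    test = population_response(offset, sigma, x)
    return np.sum(adapted * test) / np.sum(test * test)

# The half-max spatial spread of the predicted after-effect grows with
# adapter size under self-scaling, but stays fixed under fixed-scale
# filtering -- the signature the data are said to distinguish.
for size in (1.0, 2.0, 4.0):
    offsets = np.arange(0.0, 30.0, 0.1)
    spread = max(o for o in offsets if dae_magnitude(o, size) > 0.5)
    print(f"adapter width {size:.0f} deg -> half-max spread ~{spread:.1f} deg")
```

Running the loop with `scale_with_size=False` leaves the half-max spread constant across adapter sizes, which is the fixed-scale prediction the abstract reports the data reject.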

Highlights

  • Sub-second timing information is critical to the accuracy of most sensory and motor processing, yet human receptor surfaces do not appear to encode time directly in the way they initiate the analysis of non-temporal features such as pitch, location, or temperature

  • Adaptation experiments provide further evidence of sensory specificity: exposure to consistent duration information leads to a ‘duration after-effect’ (DAE), in which adaptation to relatively short/long auditory or visual durations induces perceptual expansion/compression of subsequently heard/viewed stimuli of intermediate duration

  • We propose that DAEs are a signature of mid-level visual neurons that pool spatial information across proportionally smaller lower-level inputs (a minimal sketch of this pooling follows this list)
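
As a hedged illustration of this pooling account (an assumed architecture, not the authors' implementation), the sketch below lets a mid-level unit pool lower-level responses over a Gaussian region roughly five times the stimulus size, echoing the spatial spread reported in the abstract; the pooling ratio, grid, and function names are hypothetical.

```python
# Illustrative self-scaled pooling stage. Assumptions (not from the paper):
# a mid-level, duration-selective unit weights lower-level drive over a
# Gaussian region set to ~5x the stimulus size.
import numpy as np

def low_level_drive(stim_center, stim_size, unit_centers):
    """Responses of small lower-level units tiling the visual field."""
    return np.exp(-0.5 * ((unit_centers - stim_center) / stim_size) ** 2)

def mid_level_output(stim_center, stim_size, pool_center=0.0, pool_ratio=5.0):
    """Self-scaled pooling: weight lower-level drive over ~pool_ratio x size."""
    centers = np.linspace(pool_center - 20.0, pool_center + 20.0, 401)
    pool_sigma = pool_ratio * stim_size / 2.0       # assumed half-width mapping
    weights = np.exp(-0.5 * ((centers - pool_center) / pool_sigma) ** 2)
    drive = low_level_drive(stim_center, stim_size, centers)
    return float(np.sum(weights * drive) / np.sum(weights))

# The unit keeps responding when the stimulus is displaced by several
# stimulus widths, so adaptation carried at this level would spread well
# beyond the adapter's own footprint.
for offset in (0.0, 2.0, 5.0, 10.0):
    print(f"stimulus offset {offset:4.1f} deg -> pooled drive "
          f"{mid_level_output(offset, stim_size=1.0):.3f}")
```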

