Abstract

Visual stimuli are known to vary in their perceived duration. Some visual stimuli are also known to linger longer in memory. Yet whether these two features of visual processing are linked is unknown. Despite early assumptions that time is an extracted or higher-order feature of perception, work over the past two decades has demonstrated that timing may be instantiated within sensory modality circuits. A primary location for many of these studies is the visual system, where duration-sensitive responses have been demonstrated. Furthermore, visual stimulus features have been observed to shift perceived duration. These findings suggest that visual circuits mediate or construct perceived time. Here we present evidence across a series of experiments that perceived time is affected by the image properties of scene size, clutter and memorability. More specifically, we observe that scene size and memorability dilate time, whereas clutter contracts it. Furthermore, the durations of more memorable images are also perceived more precisely. Conversely, the longer an image's perceived duration, the more memorable it is. To explain these findings, we applied a recurrent convolutional neural network model of the ventral visual system, in which images are progressively processed over time. We find that more memorable images are processed faster, and that this increase in processing speed predicts both the lengthening and the increased precision of perceived durations. These findings provide evidence for a link between image features, time perception and memory that can be further explored with models of visual processing.
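
To illustrate how "processing speed" might be quantified in a recurrent convolutional network of this kind, the minimal PyTorch sketch below evaluates a classification readout at every recurrent timestep and records the first timestep at which readout entropy drops below a threshold; images that settle earlier would count as processed faster. The network layout, layer sizes, entropy threshold and all names here are illustrative assumptions for exposition, not the model or metric used in the study.

    # Hypothetical sketch (not the authors' implementation): a small recurrent
    # convolutional network read out at every timestep. "Processing speed" for
    # an image is approximated as the first timestep at which readout entropy
    # falls below a chosen threshold.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class RecurrentConvNet(nn.Module):
        def __init__(self, n_classes: int = 10, channels: int = 32, timesteps: int = 8):
            super().__init__()
            self.timesteps = timesteps
            self.feedforward = nn.Conv2d(3, channels, kernel_size=3, padding=1)
            # Lateral recurrence: the layer's previous state feeds back into itself.
            self.lateral = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.readout = nn.Linear(channels, n_classes)

        def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
            """Return readout logits at every recurrent timestep."""
            state = torch.zeros(x.size(0), self.feedforward.out_channels,
                                x.size(2), x.size(3), device=x.device)
            logits_per_step = []
            for _ in range(self.timesteps):
                state = F.relu(self.feedforward(x) + self.lateral(state))
                pooled = state.mean(dim=(2, 3))          # global average pooling
                logits_per_step.append(self.readout(pooled))
            return logits_per_step


    def processing_speed(logits_per_step: list[torch.Tensor],
                         entropy_threshold: float = 1.0) -> torch.Tensor:
        """First timestep (0-indexed) at which readout entropy falls below threshold."""
        n_steps = len(logits_per_step)
        batch = logits_per_step[0].size(0)
        settle_step = torch.full((batch,), n_steps, dtype=torch.long,
                                 device=logits_per_step[0].device)
        for t, logits in enumerate(logits_per_step):
            probs = F.softmax(logits, dim=1)
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
            newly_settled = (entropy < entropy_threshold) & (settle_step == n_steps)
            settle_step[newly_settled] = t
        return settle_step


    if __name__ == "__main__":
        model = RecurrentConvNet()
        images = torch.randn(4, 3, 64, 64)               # stand-in image batch
        with torch.no_grad():
            steps = processing_speed(model(images))
        print("settling timestep per image:", steps.tolist())

Under these assumptions, an image whose readout becomes confident at an earlier timestep would be treated as processed faster, which is the quantity the abstract relates to longer and more precise perceived durations.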
