A key question for temporal processing research is how the nervous system extracts event duration, despite a notable lack of neural structures dedicated to duration encoding. This stands in stark contrast to the orderly arrangement of neurons tasked with spatial processing. In this study, we examine the linkage between the spatial and temporal domains. We use sensory adaptation techniques to generate after-effects in which perceived duration is either compressed or expanded in the direction opposite to the duration of the adapting stimulus. Our results indicate that these after-effects are broadly tuned, extending over an area approximately five times the size of the adapting stimulus. This region scales directly with the size of the adapting stimulus: the larger the adapting stimulus, the greater the spatial spread of the after-effect. We construct a simple model to test predictions based on overlapping adapted versus non-adapted neuronal populations and show that our effects cannot be explained by any single, fixed-scale stage of neural filtering. Rather, they are best explained by a self-scaled mechanism underpinned by duration-selective neurons that also pool spatial information from earlier stages of visual processing.
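
To make the overlap logic concrete, the following is a minimal sketch, not the paper's actual model: duration-selective units are assumed to pool spatial input with Gaussian receptive fields, the after-effect at a test location is taken to track the fraction of adapted input that location recruits, and the pooling scale either tracks adapter size (self-scaled) or is fixed. The function name `aftereffect_profile` and all parameter values are illustrative assumptions.

```python
import numpy as np

def aftereffect_profile(adapter_size, test_positions, self_scaled=True):
    """Predicted after-effect magnitude vs. test position (arbitrary units)."""
    x = np.linspace(-20, 20, 2001)                  # visual field axis (deg)
    # Pooling scale: proportional to adapter size if self-scaled,
    # otherwise a single fixed filter scale (both values are assumptions).
    sigma = 1.0 * adapter_size if self_scaled else 1.5
    adapter = (np.abs(x) <= adapter_size / 2).astype(float)  # adapter footprint
    profile = []
    for pos in test_positions:
        rf = np.exp(-0.5 * ((x - pos) / sigma) ** 2)  # unit's spatial pooling field
        rf /= rf.sum()
        # Fraction of the unit's pooled input that comes from adapted space:
        profile.append((rf * adapter).sum())
    return np.array(profile)

positions = np.linspace(0, 15, 31)
for size in (2.0, 4.0):
    ae = aftereffect_profile(size, positions)
    half = positions[np.argmin(np.abs(ae - ae.max() / 2))]
    print(f"adapter {size:.0f} deg: after-effect half-width ~ {half:.1f} deg")
```

Under these assumptions, only the self-scaled variant predicts a spatial spread that grows in proportion to adapter size; a fixed pooling scale yields roughly the same spread regardless of adapter size, which is the qualitative contrast the abstract reports.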