Event Abstract

The neural representation of time: An information-theoretical perspective

Joachim Hass1, 2* and J. Michael Herrmann3, 4

1 Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Research Group Computational Neuroscience, Germany
2 Bernstein Center for Computational Neuroscience Heidelberg Mannheim, Germany
3 University of Edinburgh, Institute of Perception, Action and Behaviour, School of Informatics, United Kingdom
4 Bernstein Center for Computational Neuroscience Göttingen, Germany

Accurate representations of time are crucial for a wide range of brain functions, such as speech recognition and the planning and execution of coordinated movements. While the neural basis of these representations remains debated, there is a large body of psychological studies probing the capabilities and limits of time perception. A prominent finding in these experiments is Weber's law (also called the "scalar property"): the linear scaling of timing errors with the duration of the interval to be estimated. The ability to reproduce this scaling has been taken as a criterion for the validity of neurocomputational models of time perception. However, the origin of Weber's law remains unknown, and currently only a few models reproduce it generically. Here, we use an information-theoretical framework to investigate the statistical origin of Weber's law in time perception, as well as the frequently observed deviations from this law [1]. We employ general Gaussian random processes as an abstract model of the neural representations of time, with temporal changes in the mean, the variance and the covariance, and use Fisher information to compute the theoretical lower bound on timing errors. Under the assumption that the brain is able to compute optimal estimates of time in this sense, we find that Weber's law holds exactly only if the estimate is based on temporal changes in the variance of the process.
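As a minimal illustration of this argument (a sketch, not part of the original study), consider the Cramér–Rao bound for a single Gaussian observation x ~ N(μ(t), σ²(t)), for which the Fisher information about t is I(t) = μ'(t)²/σ²(t) + σ²'(t)²/(2σ⁴(t)). The drift and diffusion parameters v and D below are arbitrary illustrative choices:

```python
import numpy as np

def crlb_std(dmu, var, dvar):
    """Cramer-Rao lower bound on the std of a time estimate from one
    Gaussian observation x ~ N(mu(t), var(t)).
    Fisher information: I(t) = dmu^2 / var + dvar^2 / (2 * var^2)."""
    fisher = dmu**2 / var + dvar**2 / (2 * var**2)
    return 1.0 / np.sqrt(fisher)

t = np.linspace(0.5, 10.0, 50)
v, D = 1.0, 0.1  # illustrative drift and diffusion constants

# Mean-coded time (drift mu = v*t with diffusion-like variance 2*D*t):
# the error bound grows like sqrt(t), i.e. sublinearly.
std_mean = crlb_std(v, 2 * D * t, 2 * D)

# Variance-coded time (constant mean, variance 2*D*t): I(t) = 1/(2*t^2),
# so the error bound is sqrt(2)*t -- linear scaling, i.e. Weber's law.
std_var = crlb_std(0.0, 2 * D * t, 2 * D)

weber_fraction = std_var / t  # constant (= sqrt(2) here)
```

Under these assumptions the variance-coded readout reproduces a constant Weber fraction exactly, while the mean-coded readout does not, matching the abstract's central claim.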
In contrast, the timing errors scale sublinearly with interval duration if the systematic changes in the mean of a process are used for estimation, as is the case in the majority of time perception models. Estimates based on temporal correlations even result in a superlinear scaling, as we exemplify for power-law and exponentially decaying correlations. We also relate these three types of estimators to more concrete neurocomputational models of time perception and show that this pattern is consistent with the scaling behaviors found in these models.

Furthermore, we extend the framework to the case in which multiple processes are available for estimation. We address two case studies that illustrate specific issues arising in the presence of multiple processes. First, we evaluate a previous covariance-based model [2] and show that the minimal timing error it produces scales exponentially, so that Weber's law is approximated only for relatively short intervals. Second, we show how a neurocomputational model can be formulated as a stochastic process, using the example of the synfire chain model [3]. Within this framework, we confirm that time perception based on optimal selection among multiple synfire chains also yields a statistically optimal estimate of time, and approximates Weber's law over somewhat longer intervals.

The information-theoretical framework highlights the possibility of estimating time intervals from various sources: not only systematic changes of states in the brain, but also the decay of such neuronal signals, or of signal correlations, over time. While most temporal information can be captured when only the systematic changes in the processes are used, psychophysical studies that report Weber's law to hold provide evidence for the notion that information may be conveyed by the variance alone.
As neocortical cells have been shown to provide a population rate code with high temporal resolution when working in a variance-driven regime [4], our results suggest that this processing mode is relevant for the perception of time.

Acknowledgements

This study was supported by a grant from the Bundesministerium für Bildung und Forschung in the framework of the Bernstein Center for Computational Neuroscience Göttingen, grant numbers 01GQ0432 and 01GQ0811.