Abstract
Previous investigations concluded that the human brain's information processing rate remains fundamentally constant, irrespective of task demands. However, that conclusion rested on analyses of simple discrete-choice tasks. The present contribution recasts the question of the human information rate in the context of visuomotor tasks, a more ecologically relevant, albeit more complex, arena. We argue that, while predictable aspects of inputs can be encoded virtually free of charge, real-time information transfer should be identified with the processing of surprises. We formalise this intuition by deriving from first principles a decomposition of the total information shared by inputs and outputs into a feedforward, predictive component and a feedback, error-correcting component. We find that the information measured by the feedback component, a proxy for the brain's information processing rate, scales with the difficulty of the task at hand, in agreement with cost-benefit models of cognitive effort.
Highlights
Our living environment is rich with stimuli, some of which are crucial in guiding our decisions
The present study applies information-theoretic measures to a visuomotor tracking task
We hypothesize that the feedback component is a better marker of the amount of cognitive resources required by the task than the total mutual information
Summary
Our living environment is rich with stimuli, some of which are crucial in guiding our decisions. Our brain deals with this overwhelming computational demand by selecting information through attentional mechanisms [1] and by using efficient coding: it explains away predictable data and transmits only the prediction error, that is, the sensory evidence that cannot be predicted from other sources or earlier inputs [2]. According to this view, cognitive resources (e.g., the metabolic rate of neurons or information capacity usage) would be dedicated to processing surprising inputs, while predictable data would be virtually free to encode [3]. In order to quantify the dynamics, or causality, of the relationship between multiple random processes, one must consider transition, rather than static, probabilities, which leads to the definitions of the entropy rate (for a single process) and the transfer entropy (for the interaction of two systems) [15].
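To make the last point concrete, the sketch below gives a minimal plug-in estimate of transfer entropy for two discrete processes, with history length one: TE(Y→X) = I(X_{t+1}; Y_t | X_t). The function name, the toy coupled process, and all parameter choices are ours for illustration; they are not taken from the paper, and real analyses would use longer histories and bias-corrected estimators.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy TE(source -> target) in bits, with
    history length k = l = 1, i.e. I(X_{t+1}; Y_t | X_t)."""
    x_next, x_now, y_now = target[1:], target[:-1], source[:-1]
    n = len(x_next)

    def H(*cols):
        # Joint entropy from empirical frequencies of the symbol tuples.
        counts = Counter(zip(*cols))
        p = np.array(list(counts.values())) / n
        return -np.sum(p * np.log2(p))

    # I(A; B | C) = H(A, C) + H(B, C) - H(A, B, C) - H(C)
    return (H(x_next, x_now) + H(y_now, x_now)
            - H(x_next, y_now, x_now) - H(x_now))

# Toy coupled process: X copies the previous value of Y with 10% flips,
# so information flows from Y to X but not the other way.
rng = np.random.default_rng(0)
n = 10_000
y = rng.integers(0, 2, n)
flips = rng.random(n) < 0.1
x = np.empty(n, dtype=int)
x[0] = 0
x[1:] = np.where(flips[1:], 1 - y[:-1], y[:-1])

print(transfer_entropy(y, x))  # substantially positive: Y drives X
print(transfer_entropy(x, y))  # near zero: X does not drive Y
```

Because Y here is an i.i.d. driver, the asymmetry of the two estimates recovers the direction of causal influence that static mutual information alone would miss.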