To configure our limbs in space, the brain must compute their position from sensory information provided by mechanoreceptors in the skin, muscles, and joints. Because this information is corrupted by noise, the brain is thought to process it probabilistically and to integrate it with a prior belief about arm posture, following Bayes' rule. Here, we combined computational modeling with behavioral experimentation to test this hypothesis. The model conceives the perception of arm posture as the combination of a probabilistic kinematic chain, composed of the shoulder, elbow, and wrist angles and corrupted by additive Gaussian noise, with a Gaussian prior over these joint angles. We tested whether this model explains errors in a virtual reality (VR)-based posture-matching task better than a model that assumes a uniform prior. Human participants (N = 20) were required to align their unseen right arm to a target posture, presented as a visual configuration of the arm in the horizontal plane. The results show idiosyncratic biases in how participants matched their unseen arm to the target posture. We used maximum likelihood estimation to fit the Bayesian model to these observations and to estimate key parameters, including the prior's means and variance-covariance structure. The Bayesian model with a Gaussian prior explained the response biases and variances much better than the model with a uniform prior. The prior varied across participants, consistent with the idiosyncrasies in arm posture perception and with previous behavioral research. Our work situates the biases in arm posture perception within a new perspective on the nature of proprioceptive computations.

NEW & NOTEWORTHY We modeled the perception of arm posture as a Bayesian computation. A VR posture-matching task was used to test this Bayesian model empirically. The Bayesian model, which includes a nonuniform postural prior, explained individual participants' biases in arm posture matching well.
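To make the posited computation concrete, the following minimal sketch (our own illustration, not the authors' code; all parameter values and variable names are hypothetical) combines a Gaussian prior over the three joint angles with a Gaussian likelihood centered on a noisy proprioceptive measurement. The resulting posterior estimate is pulled toward the prior mean, producing systematic biases of the kind the abstract describes.

```python
import numpy as np

# Illustrative sketch, assuming a Gaussian prior over the three joint
# angles (shoulder, elbow, wrist) and additive Gaussian sensory noise,
# as stated in the abstract. All numerical values are hypothetical.

mu_prior = np.array([0.8, 1.2, 0.1])        # hypothetical prior means (rad)
Sigma_prior = np.diag([0.05, 0.08, 0.10])   # hypothetical prior covariance
Sigma_noise = np.diag([0.02, 0.03, 0.04])   # hypothetical noise covariance

def posterior(x_meas, mu_p, Sigma_p, Sigma_n):
    """Precision-weighted (Bayes-optimal) combination of a Gaussian
    prior with a Gaussian likelihood centered on the measurement."""
    P_p = np.linalg.inv(Sigma_p)             # prior precision
    P_n = np.linalg.inv(Sigma_n)             # likelihood precision
    Sigma_post = np.linalg.inv(P_p + P_n)    # posterior covariance
    mu_post = Sigma_post @ (P_p @ mu_p + P_n @ x_meas)
    return mu_post, Sigma_post

# A noisy measurement of the true joint angles yields a posterior
# estimate biased toward the prior mean.
x = np.array([1.0, 1.0, 0.0])                # hypothetical measured angles
mu_post, Sigma_post = posterior(x, mu_prior, Sigma_prior, Sigma_noise)
print("posterior mean:", mu_post)
```

This precision-weighted average is the standard closed-form posterior for conjugate Gaussians; under these assumptions, the fitted prior mean and covariance determine both the direction of each participant's matching bias and its variance.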