When lifting an object, the brain uses visual cues and an internal object representation to predict its weight and to scale fingertip forces accordingly. Once available, tactile information is rapidly integrated to update the weight prediction and refine the internal object representation. If visual cues cannot be used to predict weight, force planning relies on implicit knowledge acquired from recent lifting experience, termed sensorimotor memory. Here, we investigated whether weight perception is similarly biased by previous lifting experience and how this bias relates to force scaling. Participants grasped and lifted a series of light or heavy objects in a semi-randomized order and estimated their weights. As expected, forces were scaled based on previous lifts (sensorimotor memory), and these effects grew with the length of recent lifting experience. Importantly, perceptual weight estimates were also influenced by the preceding lift: objects were judged lighter after a heavy lift than after a light one. In addition, weight estimates were negatively correlated with the magnitude of the planned force parameters. This perceptual bias appeared only when the current object was light, not when it was heavy, because, in line with Weber's law, sensorimotor memory effects have relatively less impact on heavy than on light objects. A control experiment tested the importance of active lifting in mediating these perceptual changes: when weights were passively applied to the hand, previous sensory experience had no effect on perception. These results highlight how fast learning of novel object-lifting dynamics can shape weight perception and demonstrate a tight link between action planning and perception. If the predicted force scaling and the actual object weight do not match, the online motor corrections rapidly implemented to downscale forces will also proportionally downscale the weight estimate.