Motion provides a powerful sensory cue for segmenting a visual scene into objects and inferring the causal relationships between objects. Fundamental mechanisms involved in this process are the integration and segmentation of local motion signals. However, the computations that govern whether local motion signals are perceptually integrated or segmented remain unclear. Hierarchical Bayesian causal inference has recently been proposed as a model for these computations, yet a hallmark prediction of the model, its dependence on sensory uncertainty, has remained untested. We used a recently developed hierarchical stimulus configuration to measure how human subjects integrate or segment local motion signals while manipulating motion coherence to control sensory uncertainty. We found that (a) the perceptual transition from motion integration to segmentation shifts with sensory uncertainty, and (b) perceptual variability is maximal around this transition point. Both findings were predicted by the model and challenge conventional interpretations of motion repulsion effects.
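The core prediction can be illustrated with a minimal, non-hierarchical two-cue Bayesian causal inference sketch (in the style of standard causal-inference models). The abstract does not specify the paper's hierarchical model or its parameters, so the Gaussian generative assumptions, the prior width `sigma_p`, and the prior probability of a common cause `p_c` below are all illustrative choices, not the authors' implementation. The sketch shows why greater sensory noise (lower motion coherence) should shift the integration-to-segmentation transition toward larger direction differences:

```python
import math

def p_common(x1, x2, sigma, sigma_p=30.0, p_c=0.5):
    """Posterior probability that two noisy motion-direction cues x1, x2
    (degrees) arose from a single common motion direction.

    Assumptions (illustrative, not from the paper): both cues have equal
    Gaussian sensory noise `sigma`; directions are drawn from a zero-mean
    Gaussian prior with width `sigma_p`; prior P(common cause) = `p_c`.
    """
    # Marginal likelihood of (x1, x2) under a common cause: both cues
    # generated from one direction s ~ N(0, sigma_p^2), s integrated out.
    var_c = sigma**4 + 2 * sigma**2 * sigma_p**2
    like_common = math.exp(
        -0.5 * ((x1 - x2)**2 * sigma_p**2 + (x1**2 + x2**2) * sigma**2) / var_c
    ) / (2 * math.pi * math.sqrt(var_c))

    # Marginal likelihood under separate causes: each cue generated from
    # its own independent direction s_i ~ N(0, sigma_p^2).
    var_s = sigma**2 + sigma_p**2
    like_sep = (
        math.exp(-0.5 * x1**2 / var_s) / math.sqrt(2 * math.pi * var_s)
        * math.exp(-0.5 * x2**2 / var_s) / math.sqrt(2 * math.pi * var_s)
    )

    # Bayes' rule over the two causal structures.
    return p_c * like_common / (p_c * like_common + (1 - p_c) * like_sep)

# Same 20-degree cue separation: a common cause is more probable when
# sensory noise is high, so integration persists to larger separations.
print(p_common(0.0, 20.0, sigma=5.0))   # low uncertainty: segmentation favored
print(p_common(0.0, 20.0, sigma=15.0))  # high uncertainty: integration favored
```

Under these assumptions, the separation at which the posterior crosses 0.5 (the integration-to-segmentation transition) grows with `sigma`, matching the abstract's finding (a); near that crossing the posterior is least decisive, consistent with the peak in perceptual variability reported in (b).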