The body's geometrical relationship with the terrain is important for depth perception in human and non-human terrestrial animals. Stationary human observers in the dark rely on the brain's internal model of the terrain, the intrinsic bias, to represent the ground as an allocentric reference frame for coding distance. However, it is unknown whether the same ground-based coding process operates when observers walk in a cue-impoverished environment with a visible ground surface. We explored this by measuring human observers' perceived locations of dimly lit targets after a short walk in the dark from the home-base location. We found that the intrinsic bias remained anchored at the home-base location, rather than the destination location, after walking, causing distance underestimation, consistent with its allocentric nature. We then measured the perceived distance of dimly lit targets from the destination location when visual depth cues were present on the floor. We found that the judged locations of targets on the floor traced out a slanted surface shifted toward the home-base location, indicating distance underestimation. This suggests that, in dynamically translating observers, the brain integrates the allocentric intrinsic bias with visual depth cues to construct an allocentric ground reference frame. More broadly, our findings underscore the dynamic interaction between the internal model of the ground and external depth cues.