This study investigates the optical information supporting visual event perception. Events are objects in motion; properties such as shape, weight, and surface material influence the dynamics that shape both movements and the resulting optics. The progressive transformation of visible textures, known as visual kinaesthetic information, specifies movements and objects. Four experiments tested whether events can be perceived from visual kinaesthetic information alone. Participants identified their own walking from point-light displays (Experiment 1), from simulated transformations of environmental texture produced by their walking (Experiment 2), and from videos shot by a head-mounted camera during outdoor walking (Experiment 3), and they distinguished strangers from footage captured by their head-mounted cameras (Experiment 4). In Experiments 2-4, the displays did not resemble the outline of a person or look like walking; instead, they revealed the physical relations between the walker and the environment that resulted from the walker's movement. Nevertheless, participants recognized themselves and distinguished strangers. Thus, observers can perceive events using visual kinaesthetic information that stems from dynamics. The one-to-one correspondences among object properties, dynamics, kinematics, and optical information are governed by the laws of physics and are unaffected by an event's appearance or the viewing perspective.