Abstract
It has been known for over 30 years that motion information alone is sufficient to yield a vivid impression of three-dimensional object structure. For example, a computer simulation of a transparent sphere whose surface is randomly speckled with dots gives no impression of depth when presented as a stationary pattern on a visual display. As soon as the sphere is made to rotate in a series of discrete steps or frames, its 3-D structure becomes apparent. Three experiments using this stimulus are described; they show that depth perception under these conditions depends crucially on the spatial and temporal properties of the display:
1. Depth is seen reliably only for between-frame rotations of less than 15°, using two-frame and four-frame sequences.
2. Parametric observations using a wide range of frame durations and inter-frame intervals reveal that depth is seen only for inter-frame intervals below 80 msec and is optimal when the stimulus can be sampled at intervals of about 40–60 msec.
3. Monoptic presentation of two frames of the stimulus is sufficient to yield depth, but the impression is destroyed by dichoptic presentation.
These data are in close agreement with the observed limits of direction perception in experiments using “short-range” stimuli. It is concluded that depth perception in the motion display used in these experiments depends on the outputs of low-level or “short-range” motion detectors.
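The stimulus described above can be sketched in a few lines of code: dots scattered uniformly over a sphere, rotated by a fixed angular step between frames, and projected orthographically onto the display plane so that each static frame is just a flat dot pattern. This is a minimal illustrative sketch, not the authors' actual display software; the function names, the choice of rotation axis, and the parameter values (e.g. a 10° step) are assumptions for illustration.

```python
import math
import random

def random_sphere_points(n, seed=0):
    """Uniform random points on the unit sphere (the random 'speckle' of dots)."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        z = rng.uniform(-1.0, 1.0)            # uniform in z gives uniform surface density
        theta = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(1.0 - z * z)
        pts.append((r * math.cos(theta), r * math.sin(theta), z))
    return pts

def rotate_y(points, degrees):
    """Rotate all points about the vertical axis by one discrete between-frame step."""
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    return [(c * x + s * z, y, -s * x + c * z) for (x, y, z) in points]

def frames(points, n_frames, step_deg):
    """Successive discrete views; each frame is the orthographic (x, y) projection,
    so any single frame carries no depth information on its own."""
    out = []
    for _ in range(n_frames):
        out.append([(x, y) for (x, y, _z) in points])  # parallel projection onto the display
        points = rotate_y(points, step_deg)
    return out

# e.g. a two-frame sequence with a 10 deg between-frame rotation
seq = frames(random_sphere_points(200), n_frames=2, step_deg=10.0)
```

Because the projection is orthographic and the sphere is transparent, dots on the front and back surfaces overlap in each frame; only the pattern of between-frame displacements distinguishes them, which is what makes the display a pure structure-from-motion stimulus.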
The Quarterly Journal of Experimental Psychology Section A