Development of head-coupled control/display systems has focused primarily on the display of three-dimensional visual information, as vision is the optimal sensory channel for the acquisition of spatial information in humans. The auditory system, however, improves the efficiency of vision by obtaining spatial information about relevant objects outside the visual field of view; this auditory information can be used to direct head and eye movements. Head-coupled display systems can therefore benefit from the addition of auditory spatial information, as it provides a natural method of signaling the location of important events outside the visual field of view. This symposium will report on current efforts in the development of head-coupled display systems, with an emphasis on the auditory spatial component.

The first paper, “Virtual Interface Environment Workstations” by Scott S. Fisher, will report on the development of a prototype virtual environment. This environment consists of a head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice, and gesture. With this interface, an operator can virtually explore a 360-degree synthesized environment and viscerally interact with its components.

The second paper, “A Virtual Display System For Conveying Three-Dimensional Acoustic Information” by Elizabeth M. Wenzel, Frederic L. Wightman, and Scott H. Foster, will report on the development of a method of synthetically generating three-dimensional sound cues for the above-mentioned interface. The development of simulated auditory spatial cues is limited to some extent by our knowledge of auditory spatial processing.

The remaining papers will report on two areas of auditory space perception that have received little attention until recently. “Perception of Real and Simulated Motion in the Auditory Modality”, by Thomas Z. Strybel, will review recent research on auditory motion perception, since a natural acoustic environment must contain moving sounds; the review will also consider applications of this knowledge to head-coupled display systems. The last paper, “Auditory Psychomotor Coordination”, will examine the interplay between the auditory, visual, and motor systems, with specific emphasis on the use of auditory spatial information in the regulation of motor responses so as to make efficient use of the visual channel.