Abstract

In pursuit of an ultimately realistic human-to-human telecommunication technology, the ability to auditorily perceive the facing direction of a human speaker was explored. A male speaker sat on a pivot chair in an anechoic chamber and spoke a short sentence (about 5 seconds) while facing one of eight azimuth angles (0 = toward the listener; 45, 90, 135, 180, 225, 270, or 315 degrees) and one of three elevation angles (0 = horizontal; -45 or 45 degrees). The azimuth angles were set solely by turning the pivot chair. Twelve blindfolded listeners heard the spoken sentence at a distance of either 1.2 or 2.4 meters from the speaker and were asked to indicate the speaker's facing angle. In separate sessions, the speaker changed facing angles while speaking, and the listeners indicated the perceived direction of horizontal movement (clockwise or counterclockwise) or vertical movement (up or down). Overall, listeners were more accurate at indicating the speaker's movement than at indicating static facing angles. Effective acoustic cues were then discussed on the basis of the transfer characteristics from the speaker's mouth to the listener's ears, measured by the cross-spectral method using the speaker's own voice.

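The closing sentence refers to the cross-spectral method for measuring the mouth-to-ear transfer characteristics. Below is a minimal sketch of how such an estimate is commonly formed, the standard H1 estimator H(f) = Pxy(f) / Pxx(f), computed here with SciPy; the signal lengths, the stand-in filter, and all parameter values are illustrative assumptions, not the authors' actual measurement setup.

```python
import numpy as np
from scipy import signal

def transfer_function(source, received, fs, nperseg=4096):
    """H1 cross-spectral estimate of the transfer function from
    `source` to `received`: H(f) = Pxy(f) / Pxx(f)."""
    f, Pxy = signal.csd(source, received, fs=fs, nperseg=nperseg)
    _, Pxx = signal.welch(source, fs=fs, nperseg=nperseg)
    return f, Pxy / Pxx

# Self-check with a known filter standing in for the mouth-to-ear path.
rng = np.random.default_rng(0)
fs = 16_000
x = rng.standard_normal(5 * fs)          # ~5 s of noise as a speech stand-in
b = signal.firwin(101, 2_000, fs=fs)     # hypothetical low-pass "path" filter
y = signal.lfilter(b, 1.0, x)

f, H = transfer_function(x, y, fs)
_, H_true = signal.freqz(b, worN=f, fs=fs)  # ground truth at the same bins
err = np.max(np.abs(np.abs(H) - np.abs(H_true)))
print(f"max magnitude error: {err:.4f}")    # should be well below 1
```

Dividing the cross-spectrum by the source auto-spectrum cancels the excitation's own spectrum, which is what allows an uncontrolled signal such as the speaker's own voice to serve as the measurement excitation.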