Abstract

Auditory localization of spatial sound sources is an important life skill for human beings. In practical, application-oriented measurements of auditory localization ability, the chosen method is a compromise among (i) data accuracy, (ii) the ease of collecting reported directions, and (iii) the cost of hardware and software. The graphical user interface (GUI)-based sound-localization experimental platform proposed here (i) is cheap, (ii) can be operated autonomously by the listener, (iii) can store results online, and (iv) supports real or virtual sound sources. To evaluate the accuracy of this method, three groups of azimuthal localization experiments are conducted with normal-hearing subjects, using 12 loudspeakers arranged at equal azimuthal intervals of 30° in the horizontal plane. In these experiments, the azimuths are reported using (i) an assistant, (ii) a motion tracker, or (iii) the newly designed GUI-based method. All three groups of results show that the localization errors are mostly within 5-12°, consistent with previous results from different localization experiments. Finally, virtual-sound-source stimuli are integrated into the GUI-based experimental platform. The results with virtual sources suggest that using individualized head-related transfer functions achieves better spatial sound-source localization performance, which agrees with previous conclusions and further validates the reliability of this experimental platform.
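The abstract reports azimuthal localization errors mostly within 5-12°. The paper does not give its error formula, but such errors are conventionally computed as the circular (wrap-around) difference between the reported and the true azimuth, so that, e.g., 350° vs. 10° counts as a 20° error rather than 340°. A minimal sketch of that convention, not taken from the paper:

```python
def azimuth_error(reported_deg: float, target_deg: float) -> float:
    """Signed azimuthal localization error in degrees, wrapped to (-180, 180].

    Positive values mean the reported azimuth lies clockwise of the target
    (under the usual convention that azimuth increases clockwise).
    """
    err = (reported_deg - target_deg) % 360.0
    return err - 360.0 if err > 180.0 else err


# Example: a source at 10 deg reported at 350 deg is a -20 deg error,
# not a 340 deg one.
print(azimuth_error(350.0, 10.0))  # -20.0
print(azimuth_error(15.0, 10.0))   # 5.0
```

The absolute value of this signed error is what an "error within 5-12°" statement would then summarize across trials.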

Highlights

  • Sound-source localization is an important auditory ability in everyday life, allowing us to locate sound sources in a timely manner [1], [2]

  • Regarding localization accuracy: interactive visual and auditory stimuli are better than auditory stimuli alone [5], [6]; stimulation with real sound sources is better than with virtual sound sources [7]; a spatially sparse distribution of stimulus signals is better than a dense one; and noise containing low-frequency components is easier to localize, because the interaural time difference (ITD), an important localization cue, is effective only at low frequencies [8], [9]

  • To further evaluate the performance of the two-dimensional (2D) graphical user interface (GUI)-based collection system integrated with a static virtual auditory display (VAD), we recruited eight subjects aged 23-35 years (mean 25.8 ± 4.0) to participate in a localization experiment
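The highlights note that the ITD cue is effective only at low frequencies. The highlights do not give a model, but a standard first-order approximation of the ITD for a source azimuth on a spherical head is Woodworth's formula, ITD = (a/c)(θ + sin θ). A hedged sketch (head radius and speed of sound are assumed textbook values, not values from the paper):

```python
import math

def woodworth_itd(azimuth_deg: float,
                  head_radius_m: float = 0.0875,
                  speed_of_sound: float = 343.0) -> float:
    """Approximate ITD in seconds via Woodworth's spherical-head model:
    ITD = (a / c) * (theta + sin(theta)), theta in radians.

    Valid for 0 <= azimuth <= 90 deg; by symmetry the magnitude is the
    same for the mirrored azimuths.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))


# The ITD grows monotonically from 0 at the median plane (0 deg)
# to a maximum of roughly 0.66 ms at 90 deg for these assumed values.
for az in (0, 30, 60, 90):
    print(az, round(woodworth_itd(az) * 1e6, 1), "us")
```

The resulting maximum of roughly 660 μs corresponds to a signal period of about 1.5 kHz, which is one common way of motivating why ITD is a reliable cue only for low-frequency content.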

Summary

A Graphical-User-Interface-Based Azimuth-Collection Method in Autonomous

INTRODUCTION
METHODS
Subjects
Experimental Environment
Sources and Configuration
GUI-Based System
DATA ANALYSIS
Comparison of Three Pointing Methods
Findings
DISCUSSION AND CONCLUSION