The gaming and electronics industry continues to develop alternative applications. Nevertheless, video game accessibility remains a persistent hurdle for individuals with disabilities, especially those with visual impairments, because most games are inherently visually oriented. Audio games (AGs) are electronic games that rely primarily on auditory cues rather than visual interfaces. This study focuses on the creation of a virtual reality AG for mobile phones that incorporates the natural head and torso movements involved in spatial hearing. The game was assessed in terms of user experience, interface usability, and sound localization performance. Eighteen sighted participants took part in a pre-post test design with a control group; the experimental group completed seven training sessions with the AG. Interviews explored facets of the gaming experience, and horizontal-plane sound source localization was tested before and after training. The results characterize the sensations associated with playing the game and interacting with its interfaces. Sound localization tests showed clear performance improvements among trained participants, with the degree of improvement varying with the assessed stimuli. These promising results point to advances for future virtual AGs and to prospects for auditory training. Such innovations hold potential for skill development, entertainment, and the inclusion of visually impaired individuals.
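The abstract does not describe the game's internals, but a core step in any head-tracked spatial audio system of this kind is converting a sound source's world position into an angle relative to the listener's current head orientation. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function name, 2D coordinate convention, and sign convention (positive azimuth to the listener's right) are assumptions made for the example.

```python
import math

def relative_azimuth(source_xy, listener_xy, head_yaw_deg):
    """Horizontal-plane angle of a sound source relative to the listener's facing direction.

    Illustrative sketch only (not from the paper). Returns degrees in (-180, 180]:
    0 = straight ahead, positive = to the listener's right, negative = to the left.
    """
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    world_angle = math.degrees(math.atan2(dx, dy))  # 0 deg points along +y
    # Subtract the head yaw and wrap into (-180, 180]
    return (world_angle - head_yaw_deg + 180.0) % 360.0 - 180.0

# Example: a source 2 m to the left of a listener facing +y
print(relative_azimuth((-2.0, 0.0), (0.0, 0.0), 0.0))  # -> -90.0
```

In a mobile VR setting, `head_yaw_deg` would typically come from the phone's orientation sensors, and the resulting azimuth would drive a binaural rendering engine so that the source appears to stay fixed in space as the player turns.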