Most activities of daily living require the dynamic integration of sights, sounds, and movements as people navigate complex environments. Nevertheless, little is known about the effects of hearing loss (HL) or hearing aid (HA) use on listening during multitasking challenges. The objective of the current study was to investigate the effect of age-related hearing loss (ARHL) on word recognition accuracy in a dual-task experiment.

Virtual reality (VR) technologies in a specialized laboratory (the Challenging Environment Assessment Laboratory) were used to produce a controlled and safe simulated environment for listening while walking. In a simulation of a downtown street intersection, participants completed two single-task conditions, listening-only (standing stationary) and walking-only (walking on a treadmill to cross the simulated intersection with no speech presented), and a dual-task condition (listening while walking). For the listening task, they were required to recognize words spoken by a target talker in the presence of a competing talker. For some blocks of trials, the target talker was always located at 0° azimuth (100% probability condition); for other blocks, the target talker was more likely (60% of trials) to be located at the center (0° azimuth) and less likely (40% of trials) to be located at the left (270° azimuth).

The participants were eight older adults with bilateral HL (mean age = 73.3 yr, standard deviation [SD] = 8.4; three males) who wore their own HAs during testing and eight controls with normal hearing (NH) thresholds (mean age = 69.9 yr, SD = 5.4; two males). No participant had clinically significant visual, cognitive, or mobility impairments.

Word recognition accuracy and kinematic parameters (head and trunk angles, step width and length, stride time, and cadence) were analyzed using mixed factorial analyses of variance (ANOVAs) with group as a between-subjects factor. Task condition (single versus dual) and probability (100% versus 60%) were within-subjects factors. In analyses of the 60% listening condition, spatial expectation (likely versus unlikely) was a within-subjects factor. Differences between groups in age and in baseline measures of hearing, mobility, and cognition were tested using t tests.

The NH group had significantly better word recognition accuracy than the HL group. Both groups performed better when the probability was higher and the target location more likely. For word recognition, dual-task costs for the HL group did not depend on probability or spatial expectation, whereas the NH group demonstrated a surprising dual-task benefit in conditions with lower probability or spatial expectation. For the kinematic parameters, both groups demonstrated a more upright and less variable head position and a more variable trunk position during dual-task conditions than during the walking-only condition, suggesting that safe walking was prioritized. The HL group demonstrated more overall stride time variability than the NH group.

This study provides new knowledge about the effects of ARHL, HA use, and aging on word recognition when individuals also perform a mobility-related task typical of everyday life. This research may help inform the development of more effective function-based approaches to assessment and intervention for people who are hard of hearing.
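For readers who want a concrete picture of the analysis described above, the sketch below shows how a mixed-design ANOVA of this kind might be set up, assuming Python with pandas and pingouin. All data values are fabricated placeholders, the variable names are hypothetical, and only one within-subjects factor (task condition) is shown; the study's full design additionally crossed probability and, in the 60% blocks, spatial expectation.

```python
# Minimal sketch of a mixed-design ANOVA (between-subjects group x
# within-subjects task condition) on fabricated placeholder data.
# Names, values, and effect sizes are illustrative, not the study's.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
rows = []
for s in range(16):                      # 8 HL + 8 NH participants
    group = "HL" if s < 8 else "NH"
    for cond in ("single", "dual"):
        # placeholder word-recognition accuracy (proportion correct)
        base = 0.60 if group == "HL" else 0.80
        acc = float(np.clip(base + rng.normal(0, 0.05), 0, 1))
        rows.append({"subject": f"S{s:02d}", "group": group,
                     "condition": cond, "accuracy": acc})

df = pd.DataFrame(rows)

# One between-subjects and one within-subjects factor; the reported
# design also included probability (100% vs 60%) as a within factor.
aov = pg.mixed_anova(data=df, dv="accuracy", within="condition",
                     subject="subject", between="group")
print(aov.round(3))
```

The long (one row per subject per condition) data layout is what mixed-model ANOVA routines generally expect; each subject must contribute an observation at every within-subjects level.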