Indoor search and rescue (SAR) missions frequently encounter challenges due to complex spatial layouts and high-stress conditions. Augmented Reality (AR) has emerged as a promising tool for SAR efforts, offering spatial information through exocentric and egocentric perspectives. In particular, AR systems can overlay detailed spatial information onto the physical environment from an egocentric perspective, which holds promise for navigating the complexities of indoor SAR operations. However, the effectiveness of these AR perspectives in a team-based SAR context remains underexplored. This study introduces a multi-user AR platform that combines reality capture, synchronized real-time spatial data, and a real-time cognitive load monitoring module to support team-based SAR missions. A human-subject experiment (N = 64) was conducted with combined variations of spatial complexity and collaboration strategy to explore the efficacy of egocentric and exocentric navigation cues in multi-agent SAR tasks. Participants were asked to search for two consecutive targets within a given time limit while collaborating either in parallel or hierarchically. Results indicated that egocentric cues offered limited benefits in a multi-agent setup, whereas exocentric cues notably increased team performance. Hierarchically structured teams made fewer errors and were more efficient during wayfinding tasks. Cognitive load analysis revealed the importance of optimizing collaborative strategies, rather than individual performance, to manage the mental demands of SAR missions. These findings contribute to understanding the complex role of AR interface design in multi-agent SAR support and can inform the design of AR systems for SAR scenarios.