Abstract

Emergency response teams must coordinate their actions and goals in dynamic and demanding situations to ensure patient health and safety. Measuring and modeling such dynamic phenomena under real-world conditions is a major challenge. However, measuring coordination in real-world conditions is necessary to better understand how teams function and to enhance their individual situation awareness, coordination, and learning. Sensor-based and video-based behavioral measurement of coordination within and between teams has been proposed as a method for continuously recording dynamic, physical behavioral data streams related to proximity, movement, or positioning that can inform the team. Yet very few studies have attempted to implement unobtrusive sensor-based data collection, especially in messy and challenging conditions. If we are to truly embrace the dynamic and complex nature of team-based behaviors and learning with new forms of data collection such as unobtrusive sensors, digital video, and audio, we need to better understand the practical difficulties of collecting, processing, and integrating these multiple digital data streams in real time to generate meaning in demanding, continually evolving, real-world contexts such as emergency response. In the current paper, we strive to advance our understanding of the ground truth of leveraging sensor-based and other digital data streams to identify valid constructs, uncover meaningful indicators, and target relevant combinations of digital data streams to visualize and improve understanding of how teams coordinate and learn in team environments.
