Medical events can affect space crew health and compromise the success of deep space missions. To handle such events successfully, crew members must be sufficiently prepared to manage certain medical conditions for which they are not technically trained. Extended Reality (XR) can provide an immersive, realistic user experience that, when integrated with augmented clinical tools (ACT), can improve training outcomes and provide real-time guidance during non-routine tasks and diagnostic and therapeutic procedures. The goal of this study was to develop a framework to guide XR platform development, using astronaut medical training and guidance as the domain for illustration. We conducted a mixed-methods study—using video conference meetings (45 subject-matter experts), Delphi panel surveys, and a web-based card sorting application—to develop a standard taxonomy of essential XR capabilities. We augmented this by identifying additional models and taxonomies from related fields. Together, this “taxonomy of taxonomies” and the essential XR capabilities identified serve as an initial framework to structure the development of XR-based medical training and guidance for use during deep space exploration missions. We provide a schematic approach, illustrated with a use case, for how this framework and the materials generated through this study might be employed.