Abstract

A three-dimensional (3D) visualization of disaster scenes based on mobile virtual reality (VR) can extend the application scenarios and emergency service capabilities of traditional 3D disaster-scene visualization. Because the smartphone must be placed into a mobile head-mounted display, conventional touch interaction cannot be used in mobile VR, and the user's gaze usually serves as the default means of scene interaction. However, existing gaze-based interaction methods for mobile VR scenes are passive and cannot meet the basic requirement of actively roaming through and exploring large-scale, large-space disaster scenes. This study therefore focuses on gaze-based mobile VR interaction to satisfy the varied interaction requirements of large-scale, large-space disaster scenes. First, a dynamic user interface (UI) generation method for gaze interaction in such scenes is proposed to support the active exploration of mobile VR disaster scenes. Second, disaster-scene exploration and disaster-information query methods based on the dynamic UI and gaze are proposed. Finally, using a flood disaster as an example, a prototype system and associated experiments are discussed. The experimental results indicate that the gaze-based mobile VR interaction methods presented in this study effectively support users in actively roaming through and exploring large-scale, large-space disaster scenes, performing disaster simulation analysis, and interactively querying disaster information within mobile VR, making effective interaction with mobile VR disaster scenes possible.
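Gaze-based selection in mobile VR is commonly implemented as dwell-time activation: a ray is cast from the head pose, and a UI element is "clicked" once the gaze has rested on it long enough. The sketch below illustrates this generic pattern only; it is not the paper's specific method, and the class name, threshold value, and frame-update interface are illustrative assumptions.

```python
# Generic dwell-time gaze selection sketch (illustrative, not the
# paper's exact algorithm). The engine is assumed to report, once per
# frame, which UI element the gaze ray currently hits (or None).

DWELL_THRESHOLD_S = 1.0  # assumed dwell time before a gaze "click" fires


class GazeDwellSelector:
    """Tracks how long the gaze stays on one target and fires a
    selection once the dwell threshold is reached."""

    def __init__(self, threshold_s: float = DWELL_THRESHOLD_S) -> None:
        self.threshold_s = threshold_s
        self.current_target = None
        self.dwell_s = 0.0

    def update(self, target, dt_s: float):
        """Call once per frame with the element under the gaze ray
        (or None) and the frame time in seconds; returns the selected
        element on the frame the dwell completes, otherwise None."""
        if target != self.current_target:
            # Gaze moved to a different element: restart the dwell timer.
            self.current_target = target
            self.dwell_s = 0.0
            return None
        if target is None:
            return None
        self.dwell_s += dt_s
        if self.dwell_s >= self.threshold_s:
            self.dwell_s = 0.0  # reset so the selection fires only once
            return target
        return None
```

In a scene loop, the selector would be fed the hit result of a head-pose raycast each frame; a returned element then triggers the associated action, such as opening a disaster-information query panel.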
