Abstract

The governance of Virtual-Reality Integration, which seamlessly merges the virtual world (the metaverse) with physical reality, is an emerging approach to the perception and comprehension challenges posed by complex computational environments. Such Virtual-Reality Integration systems can reduce the complexity of data analysis, offer real-time visualization, and provide user-centric interaction, thereby supporting data analysis and informed decision-making in complex computational settings. In this paper, we introduce a real-time perception and interaction method that combines computer vision with Virtual-Reality Integration technology. We employ a Grid-ORB-based approach for high-precision feature extraction and three-dimensional registration tracking on resource-constrained devices, enabling the perception of physical entities. Furthermore, we use the Kriging method, augmented with a drift term, to fill gaps in numerical data over the physical space, helping users observe real-world physical quantities and their trends. To provide a unified cognitive experience of data and knowledge, we design a user-centric interaction interface based on augmented reality, in which users can manipulate charts and controls through eye movement, gestures, and similar modalities. Finally, we validate our system in a real thermodynamics experimental environment; the results demonstrate a significant improvement in users' efficiency in comprehending data and knowledge within complex environments.
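The gap-filling step described above relies on Kriging with a drift term (universal Kriging). As a rough sketch of that idea only, not the authors' implementation, the following NumPy snippet interpolates a 1-D field using a linear drift basis; the Gaussian covariance model and its hyperparameters (`length`, `sill`) are assumptions chosen for illustration.

```python
import numpy as np

def universal_kriging(xs, zs, x0, length=1.0, sill=1.0):
    """Universal Kriging (Kriging with a linear drift term) in 1-D.

    Estimates z at x0 from samples (xs, zs). The drift basis [1, x]
    captures a linear trend; a Gaussian covariance with assumed
    hyperparameters `length` and `sill` models the residual field.
    """
    xs = np.asarray(xs, float)
    zs = np.asarray(zs, float)
    n = len(xs)

    cov = lambda a, b: sill * np.exp(-((a - b) / length) ** 2)

    # Covariance among samples, and the drift basis evaluated at samples.
    K = cov(xs[:, None], xs[None, :])
    F = np.column_stack([np.ones(n), xs])  # drift basis [1, x]

    # Extended Kriging system: [[K, F], [F.T, 0]] @ [w; mu] = [k0; f0],
    # where mu are Lagrange multipliers enforcing the drift constraints.
    A = np.block([[K, F], [F.T, np.zeros((2, 2))]])
    rhs = np.concatenate([cov(xs, x0), [1.0, x0]])
    w = np.linalg.solve(A, rhs)[:n]  # Kriging weights for the samples

    return float(w @ zs)

# Fill a gap at x = 2.5 in samples that follow a roughly linear trend.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
zs = [0.0, 1.1, 1.9, 3.2, 4.0]
print(universal_kriging(xs, zs, 2.5))
```

Because the drift constraints make the estimator exact at sample locations, evaluating at any `xs[i]` returns `zs[i]`; between samples the prediction blends the linear trend with the covariance-weighted residuals.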

