Abstract

Virtual reality (VR) experimental platforms are widely used in the teaching of chemical experiments because they are engaging, lifelike, and safe. However, on existing VR experimental platforms users cannot operate genuine experimental equipment, and these systems cannot accurately assess and manage users' operation intentions. To address these concerns, this paper develops a smart glove with cognitive capabilities, composed of several simple commercial sensors and binocular cameras. The smart glove can be applied to the virtual-reality fusion chemical experiment platform established in this paper. In addition, a navigation interaction algorithm based on multimodal intention understanding (hereinafter the NIAMIU algorithm) is proposed; through this algorithm, the smart glove can obtain the user's operation intention accurately and efficiently. Experiments show that the smart glove described in this paper can precisely determine the position of the user's hands in space and can guide and correct the user's interactive behavior. Moreover, users can converse with the smart glove in an unsupervised setting and complete the experiment by following the instructions it supplies. Compared with a conventional data glove, the smart glove designed in this paper considerably improves the accuracy and efficiency of human-computer interaction.
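The abstract does not detail how NIAMIU combines the glove's sensor readings, the binocular-camera observations, and speech into one operation intention. As a purely illustrative sketch (not the paper's actual algorithm), a baseline form of multimodal intention fusion is a confidence-weighted vote over per-modality intent scores; the modality names, weights, and intent labels below are assumptions for demonstration only.

```python
# Hypothetical sketch of multimodal intention fusion: each modality
# (glove sensors, binocular camera, speech) assigns scores to candidate
# intents, and a reliability-weighted sum picks the winner. All names
# and numbers here are illustrative assumptions, not the NIAMIU
# algorithm from the paper.

def fuse_intentions(modality_scores, weights):
    """Combine per-modality intent scores into one ranked estimate.

    modality_scores: {modality: {intent: score in [0, 1]}}
    weights:         {modality: reliability weight}
    Returns the intent with the highest weighted total score.
    """
    totals = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for intent, score in scores.items():
            totals[intent] = totals.get(intent, 0.0) + w * score
    return max(totals, key=totals.get)

# Example: glove and camera agree on grasping a beaker; speech
# suggests pouring but carries less weight.
scores = {
    "glove":  {"grasp_beaker": 0.7, "pour": 0.2},
    "camera": {"grasp_beaker": 0.6, "pour": 0.3},
    "speech": {"pour": 0.9},
}
weights = {"glove": 0.5, "camera": 0.3, "speech": 0.2}
print(fuse_intentions(scores, weights))  # → grasp_beaker
```

A weighted vote like this is only the simplest fusion scheme; a real system would also need temporal smoothing and a way to resolve modality conflicts, which the paper's algorithm presumably addresses.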
