Abstract

Virtual experiments have become an active research topic in education. However, current virtual experiments have several limitations. First, researchers represent experiments with purely simulated virtual effects, which reduces the immersion of the user's simulated experiments. Second, most virtual experiments support only mouse or touch-screen interaction, which reduces their realism. Third, students who independently explore the experimental procedure spend too much time on the simulation, leading to operational overload and low interaction efficiency. To address these problems, we propose and implement a multimodal navigational interaction virtual and real fusion chemistry laboratory (MNIVRFCL). We design intelligent equipment with a new sensing structure and propose a multimodal navigational interaction (MMNI) algorithm based on the auditory and tactile channels, both of which are verified and applied in MNIVRFCL. The MMNI algorithm detects users' specific behaviors to understand their behavioral intentions, and the system then guides and corrects users' current behaviors through voice navigation broadcasts. As a result, students can interact with the virtual and real fusion environment through the tactile and auditory channels and independently complete simulated experiments and learning guided by experimental navigation. Statistical results show that the system understands user intention correctly 91.48% of the time, and that MNIVRFCL reduces operational load by 23.81% compared to the purely virtual experiment, lowering time consumption and improving students' interaction efficiency.

Highlights

  • Artificial Intelligence (AI) technology has been successfully applied to the field of Human-computer Interaction (HCI), and has gradually changed our lifestyle

  • This paper proposes an overall framework for the multimodal navigational interaction paradigm for virtual and real fusion experiments

  • In SEI2 and SEI5, the evaluation of the MNIVRFCL is significantly higher than that of NOBOOK, by 42% and 36% respectively, showing that an operator who is unfamiliar with the virtual experiment environment does not need to waste extra time performing the experiment in MNIVRFCL


Summary

INTRODUCTION

Artificial Intelligence (AI) technology has been successfully applied to the field of Human-Computer Interaction (HCI) and has gradually changed our lifestyle. This article provides interactive equipment for virtual pouring experiments in middle school chemistry. It includes designing new sensing structures and sensing methods on intelligent equipment that combine speech and sensors to perceive user behaviors. Such a new experiment kit is a smarter experimental tool and needs to meet the following characteristics: first, it must have relatively strong interaction capabilities and be able to integrate and process information entered by the user through different modalities. This paper first discusses how to design and realize interactive equipment and an interaction method that enhance the user's immersion and realism; second, it proposes an MMNI algorithm to understand users' behavioral intentions and improve students' interaction efficiency.
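The core idea of combining the tactile channel (sensor events from the equipment) with the auditory channel (recognized speech) to infer a user's intended step, and then broadcasting corrective voice navigation, can be sketched as follows. This is a minimal illustrative sketch only: all names here (`Observation`, `TACTILE_TO_STEP`, `infer_intent`, the step labels) are assumptions for illustration, not the authors' actual MMNI implementation.

```python
# Hypothetical sketch of a multimodal navigational interaction loop:
# fuse tactile sensor events with recognized speech to infer the user's
# intended experiment step, then emit confirming or corrective voice guidance.
from dataclasses import dataclass
from typing import Optional

# Assumed order of steps in a pouring experiment (illustrative only).
STEP_SEQUENCE = ["pick_up_beaker", "tilt_beaker", "pour_liquid", "set_down_beaker"]

# Simple fusion rule: the tactile channel is primary; speech disambiguates.
TACTILE_TO_STEP = {
    "grip_detected": "pick_up_beaker",
    "tilt_detected": "tilt_beaker",
    "flow_detected": "pour_liquid",
    "rest_detected": "set_down_beaker",
}

@dataclass
class Observation:
    tactile_event: str  # e.g. "tilt_detected" from the equipment's sensors
    speech_text: str    # recognized utterance; may be empty

def infer_intent(obs: Observation) -> Optional[str]:
    """Map a fused observation to an intended step (None if unrecognized)."""
    step = TACTILE_TO_STEP.get(obs.tactile_event)
    if step is None and obs.speech_text:
        # Fall back to keyword spotting on the auditory channel.
        for candidate in STEP_SEQUENCE:
            if candidate.split("_")[0] in obs.speech_text.lower():
                return candidate
    return step

def navigate(obs: Observation, expected_step: str) -> str:
    """Return a voice-navigation prompt for the current observation."""
    intent = infer_intent(obs)
    if intent is None:
        return "I could not recognize that action; please try again."
    if intent == expected_step:
        return f"Correct: {intent}. Proceed to the next step."
    return f"Detected {intent}, but the next step is {expected_step}."
```

A navigation loop of this shape lets the system confirm correct behaviors and redirect incorrect ones at each step, which is the mechanism the abstract attributes to the MMNI algorithm.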

INTERACTION METHOD OF INTELLIGENT EQUIPMENT
Data sending method
EXPERIMENTAL AND EVALUATION
Findings
CONCLUSION AND DISCUSSION
