Abstract

Virtual chemistry experiments are an important teaching tool in middle schools: they help students understand experimental principles and memorize experimental procedures. However, existing virtual chemistry experiments still have several problems. First, in existing research, user intentions are often misunderstood and cannot be recognized accurately. Second, existing systems fail to identify the user's wrong actions, which reduces the accuracy of the experiment. Third, the user's sense of operation and realism during the experiment is weak, which degrades the user experience. Finally, the lack of navigation guidance for experimental operations increases the user's learning time. To solve these problems, this paper proposes a scheme for an intelligent navigational chemical laboratory based on multimodal fusion. First, we design a new smart beaker with perceptual ability, which can be used to complete most chemical experiments and gives users a realistic sense of operation. In addition, we propose a multimodal fusion understanding algorithm, which reduces misidentification during the experiment and better captures the user's real intention. Finally, intelligent navigation and wrong-behavior recognition functions are added to the experimental equipment, which improves the efficiency of human-computer interaction. The results show that, compared with existing virtual laboratories and systems, the proposed chemical laboratory scheme greatly reduces the user's memory load and improves the success rate of the experiment through the multimodal fusion understanding algorithm. Moreover, by combining the virtual and the real, the virtual chemistry experiment not only improves the authenticity of the operation but also stimulates students' interest in learning, and it was well received by users.

Highlights

  • Many of the experiments in middle school chemistry textbooks are destructive, costly, and relatively dangerous

  • We propose a multimodal fusion understanding algorithm that perceives the user’s behavior, understands the user’s intention, and reduces misidentification

  • In the navigation interaction paradigm based on multimodal fusion understanding, visual presentation uses information enhancement to show the effects of chemical reactions in a virtual scene

Summary

INTRODUCTION

Many of the experiments in middle school chemistry textbooks are destructive, costly, and relatively dangerous. An interactive method based on multimodal fusion can use one modality to compensate for the ‘‘incomprehension’’, ‘‘incompleteness’’, or ‘‘misunderstanding’’ of the intention derived from another modality. This eliminates the ambiguity that arises from relying on a single modality’s input to infer intention, so that the computer can understand the user’s intention more accurately and the process of human-computer interaction becomes more efficient and natural. Some existing studies use only a single interaction method, lack understanding of and feedback on user intentions, or lack guidance and prompts for the experimental process, all of which impair the user’s sense of operation and the fluency of the experiment.
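The complementary role of modalities described above can be illustrated with a minimal late-fusion sketch. This is a hypothetical example (the function name, weights, and intent labels are illustrative assumptions, not the paper's actual algorithm): each modality produces a confidence score per candidate intent, and a weighted combination lets a confident modality resolve the ambiguity of an uncertain one.

```python
def fuse_intents(modality_scores, weights):
    """Weighted late fusion of per-modality intent confidences.

    modality_scores: dict mapping modality name -> {intent: confidence in [0, 1]}
    weights: dict mapping modality name -> fusion weight (should sum to 1)
    Returns the (intent, fused_confidence) pair with the highest fused score.
    """
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights[modality]
        for intent, confidence in scores.items():
            # Accumulate each modality's weighted vote for this intent
            fused[intent] = fused.get(intent, 0.0) + w * confidence
    return max(fused.items(), key=lambda kv: kv[1])

# Voice alone is ambiguous between "pour" and "stir" (0.45 vs 0.40),
# but the gesture modality disambiguates in favor of "pour".
scores = {
    "voice":   {"pour": 0.45, "stir": 0.40},
    "gesture": {"pour": 0.80, "stir": 0.10},
}
intent, confidence = fuse_intents(scores, {"voice": 0.5, "gesture": 0.5})
```

Here the fused score for "pour" (0.625) clearly exceeds "stir" (0.25), even though the voice channel alone could not separate them reliably.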

USER BEHAVIOR PERCEPTION METHOD
MULTIMODAL INFORMATION PERCEPTION AND RECOGNITION
MULTIMODAL FUSION UNDERSTANDING MODEL
EXPERIMENT AND EVALUATION
Findings
CONCLUSION
