Abstract

We develop a virtual reality based driving training system for a self-propelled gun (SPG). To make the system's interface more powerful and natural, hand gesture interaction is incorporated into it. This paper discusses the use of hand gestures for interaction with the virtual training environment. We employ static hand gestures, coupled with hand translations and rotations, as the method of interacting with the virtual training environment. An 18-sensor data glove is used to monitor the movements of the fingers and the wrist, and a feed-forward neural network is developed to recognize gestures for use in the virtual training application of the artillery SPG. We present our approach to algorithm design and implementation, and the use of the gestures in our application. The presented hand gesture interaction method can be used effectively in our virtual reality SPG training system to perform various manipulation tasks in a faster, more precise, and more natural way.
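
The sketch below illustrates the general shape of such a gesture classifier: a small feed-forward network that maps one frame of 18 data-glove sensor readings to a static gesture class. It is a minimal illustration only; the layer sizes, activation function, number of gesture classes, and the names `classify` and `sigmoid` are assumptions, not details taken from the paper, and the weights here are random placeholders rather than trained values.

```python
# Minimal sketch of a feed-forward gesture classifier (not the authors'
# implementation). One frame of 18 glove sensor values (finger joint and
# wrist measurements, assumed normalized to [0, 1]) is mapped to one of
# N_GESTURES static gesture classes.
import numpy as np

N_SENSORS = 18    # 18-sensor data glove: one value per sensor (assumption)
N_HIDDEN = 24     # hidden layer size chosen arbitrarily for this sketch
N_GESTURES = 10   # number of static gestures (assumption)

rng = np.random.default_rng(0)

# Randomly initialized weights; in a real system these would be learned
# from labeled glove recordings, e.g. by backpropagation.
W1 = rng.normal(0.0, 0.1, (N_SENSORS, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, N_GESTURES))
b2 = np.zeros(N_GESTURES)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(sensor_frame):
    """Forward pass: 18 sensor values -> index of the recognized gesture."""
    h = sigmoid(sensor_frame @ W1 + b1)   # hidden layer activations
    scores = h @ W2 + b2                  # output layer scores
    return int(np.argmax(scores))         # most likely gesture class

# Example: classify one (synthetic) normalized glove frame.
frame = rng.uniform(0.0, 1.0, N_SENSORS)
print("Recognized gesture class:", classify(frame))
```

In the training system described here, the recognized static gesture would then be combined with the tracked hand translation and rotation to drive the corresponding manipulation in the virtual environment.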
