Abstract

Stereoscopic display based on Virtual Reality (VR) can help doctors observe 3D virtual anatomical models with depth cues, assisting them in intuitively investigating the spatial relationships between different anatomical structures without relying on mental imagination. However, few input devices can be used to control virtual anatomical models in a sterile operating room. This paper presents a cost-effective VR application system for the demonstration of 3D virtual anatomical models with non-contact interaction and stereoscopic display. The system integrates hand gesture interaction and voice interaction to achieve non-contact control. Hand gesture interaction is implemented with a Leap Motion controller mounted on an Oculus Rift DK2. Voice commands are converted into operations using Bing Speech for English and Aitalk for Chinese, respectively. A local relational database records the anatomical terminologies and registers them with the speech recognition engine so that these uncommon words can be queried, and the hierarchical nature of the terminologies is stored in a tree structure. In the experiments, ten participants were asked to evaluate the proposed system. The results show that our system is more efficient than the traditional interactive manner and verify its feasibility and practicability in the sterile operating room.
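The abstract describes a local store of anatomical terminologies organised hierarchically in a tree, whose terms are also supplied to the speech recognition engine as custom vocabulary. The following is a minimal Python sketch of such a structure under our own assumptions; the class and method names (TerminologyTree, phrase_list, etc.) are hypothetical illustrations, not the authors' implementation or any speech-engine API.

```python
# Minimal sketch (assumed names) of a hierarchical anatomical-terminology store:
# each term is a node in a tree mirroring the anatomical hierarchy, and the flat
# list of stored terms can serve as a custom phrase list for a speech engine.

from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class TermNode:
    """One anatomical term and its children in the hierarchy."""
    name: str
    children: Dict[str, "TermNode"] = field(default_factory=dict)

    def add_child(self, name: str) -> "TermNode":
        node = TermNode(name)
        self.children[name] = node
        return node


class TerminologyTree:
    """Local store of anatomical terminologies with hierarchical lookup."""

    def __init__(self, root_name: str = "body"):
        self.root = TermNode(root_name)
        self._index: Dict[str, TermNode] = {root_name: self.root}

    def add(self, parent: str, name: str) -> None:
        # Attach a new term under an existing parent and index it by name.
        self._index[name] = self._index[parent].add_child(name)

    def find(self, name: str) -> Optional[TermNode]:
        # Look up the node for a recognised voice command, if present.
        return self._index.get(name)

    def phrase_list(self) -> List[str]:
        """All stored terms, e.g. to register as custom speech vocabulary."""
        return list(self._index)


# Example usage: query a term after a recognised voice command.
tree = TerminologyTree()
tree.add("body", "heart")
tree.add("heart", "left atrium")
tree.add("heart", "right ventricle")
print(tree.find("left atrium"))   # matched terminology node
print(tree.phrase_list())         # vocabulary handed to the speech engine
```

Keeping a name-to-node index alongside the tree lets voice queries resolve a term in constant time while the parent-child links preserve the anatomical hierarchy for navigation.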
