Abstract

Virtual environments (VEs) that enable the user to touch, feel, and manipulate virtual objects through haptic interactions are expected to have applications in many areas such as medicine, CAD/CAM, entertainment, fine arts, and education. The current state of technology allows the human operator to interact with virtual objects through the probe (such as a thimble or a stylus) of a force-reflecting haptic interface. Most current haptic interaction algorithms model the probe as a single point and allow the user to feel the forces that arise from point interactions with virtual objects. In this paper, we propose a ray-based haptic-rendering algorithm that enables the user to touch and feel convex polyhedral objects with a line-segment model of the probe. The ray-based haptic-rendering algorithm computes both forces and torques due to collisions of the tip and/or side of the probe with multiple virtual objects, as required in simulating many tool-handling applications. Since the real-time simulation of haptic interactions between a 3D tool and objects is computationally quite expensive, ray-based rendering can be considered an intermediate step toward achieving this goal, because it simplifies the computational model of the tool. To compare the ray-based and point-based haptic interaction techniques in the haptic perception of 3D objects, we conducted perceptual experiments in which participants were asked to identify the shape of four different 3D primitives (sphere, cone, cylinder, and cube) displayed in random order using both point- and ray-based techniques. The results show that, on average, 3D objects are recognized faster with ray-based rendering than with point-based rendering.
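To make the line-segment idea concrete, the sketch below shows one way a ray-based response could be computed against a single convex primitive. This is an illustrative assumption, not the paper's algorithm: it tests a sphere (so the closest-point computation stays closed-form, whereas the paper handles convex polyhedra), uses a simple penalty-spring contact model with an assumed stiffness K_STIFFNESS, and the function names are hypothetical. What it does capture is the key difference from point-based rendering: contact is checked along the entire probe segment, so a collision with the side of the probe yields a torque about the handle in addition to a force.

```python
# Minimal sketch of ray-based haptic force/torque computation against a
# sphere primitive. Illustrative only: the paper's algorithm targets convex
# polyhedra; the sphere, stiffness value, and function names are assumptions.

import numpy as np

K_STIFFNESS = 500.0  # penalty spring constant (N/m), assumed value


def closest_point_on_segment(a, b, c):
    """Closest point to c on the segment from a to b."""
    ab = b - a
    t = np.dot(c - a, ab) / np.dot(ab, ab)
    t = np.clip(t, 0.0, 1.0)  # clamp to the segment endpoints
    return a + t * ab


def ray_based_response(tip, tail, center, radius):
    """Force on the probe and torque about its tail (handle) for a sphere.

    tip, tail      : endpoints of the line-segment probe
    center, radius : the sphere primitive
    Returns (force, torque); both are zero vectors when there is no contact.
    """
    contact = closest_point_on_segment(tail, tip, center)
    offset = contact - center
    dist = np.linalg.norm(offset)
    if dist >= radius:  # no penetration anywhere along the segment
        return np.zeros(3), np.zeros(3)
    normal = offset / dist                        # outward surface normal
    penetration = radius - dist                   # penetration depth
    force = K_STIFFNESS * penetration * normal    # penalty (spring) force
    torque = np.cross(contact - tail, force)      # torque about the handle
    return force, torque


# Example: the side of the probe grazes a unit sphere at the origin.
f, tau = ray_based_response(tip=np.array([0.9, -1.0, 0.0]),
                            tail=np.array([0.9, 1.0, 0.0]),
                            center=np.zeros(3), radius=1.0)
print("force:", f, "torque:", tau)
```

A point-based renderer would run the same penetration test on the tip alone; testing the closest point of the whole segment is what allows the side of the probe to contact objects and generate the torques the abstract describes.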
