Abstract

Modern haptic interfaces are adept at conveying the large-scale shape of virtual objects, but they often provide unrealistic or no feedback for the microscopic details of surface texture. Direct texture rendering challenges the state of the art in haptics because it requires a finely detailed model of the surface's properties, real-time dynamic simulation of complex interactions, and high-bandwidth haptic output so that the user can feel the resulting contacts. This paper presents a new, fully realized solution for creating realistic virtual textures. Our system employs a sensorized handheld tool to capture the feel of a given texture, recording three-dimensional tool acceleration, tool position, and contact force over time. We reduce the three-dimensional acceleration signals to a perceptually equivalent one-dimensional signal, and we then use linear predictive coding to distill this raw haptic information into a database of frequency-domain texture models. Finally, we render these texture models in real time on a Wacom tablet using a stylus augmented with small voice coil actuators. The resulting virtual textures provide a compelling simulation of contact with the real surfaces, which we verify through a human-subject study.
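
To make the two signal-processing steps concrete, the sketch below gives a minimal Python illustration, not the authors' implementation: it assumes a DFT321-style spectral combination for the three-to-one acceleration reduction (the abstract does not name the method) and a textbook Yule-Walker fit for the LPC model. The function names, model order, and fixed-variance noise excitation are all illustrative simplifications.

    import numpy as np
    from scipy.linalg import solve_toeplitz
    from scipy.signal import lfilter

    def reduce_to_1d(accel_xyz):
        """Collapse an (n_samples, 3) acceleration recording into one 1-D
        signal: keep the combined per-frequency energy of the three axes
        and borrow the phase of their spectral sum (a DFT321-style rule)."""
        spectra = np.fft.rfft(accel_xyz, axis=0)              # (n_freq, 3)
        magnitude = np.sqrt((np.abs(spectra) ** 2).sum(axis=1))
        phase = np.angle(spectra.sum(axis=1))
        return np.fft.irfft(magnitude * np.exp(1j * phase),
                            n=accel_xyz.shape[0])

    def fit_lpc(signal, order=30):
        """Fit an order-p all-pole (LPC) model by solving the Yule-Walker
        equations; returns predictor coefficients and residual variance."""
        n = len(signal)
        r = np.correlate(signal, signal, mode="full")[n - 1 : n + order] / n
        a = solve_toeplitz((r[:-1], r[:-1]), r[1:])           # coefficients
        residual_var = r[0] - a @ r[1:]                       # error power
        return a, residual_var

    def synthesize_texture(a, residual_var, n_samples, seed=None):
        """Render a texture vibration by exciting the all-pole filter
        1 / (1 - sum_k a_k z^-k) with white noise of the residual's power."""
        rng = np.random.default_rng(seed)
        excitation = (rng.standard_normal(n_samples)
                      * np.sqrt(max(residual_var, 0.0)))
        return lfilter([1.0], np.concatenate(([1.0], -a)), excitation)

In the full system, rendering would presumably modulate the synthesized vibration with the user's measured contact force and scanning speed; driving the filter with constant-power noise, as here, captures only a single recording condition.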
