Abstract
Due, in part, to the popularity of online shopping, there is considerable interest in enabling consumers to experience material touch via internet-connected devices. While there have been several efforts to render texture via electrostatic tactile displays, the textures involved have typically consisted of synthetic patterns, such as shapes, shadings, or gradients of photographic textures. In this paper, we propose a data-driven algorithm for the haptic rendering of fabric textures on an electrostatic tactile display. We measure the friction force, normal force, and displacement during the swiping of a finger across real fabric using a new measurement apparatus introduced here. From these measurements, we compute friction coefficients as the ratio of the recorded friction force to the normal force. We then reproduce these friction coefficients by controlling the voltage applied to an electrostatic tactile display in order to render the tactile texture of the measured fabric. To evaluate this rendering method, we conducted a psychophysical experiment that assessed the visual and haptic similarity of ten real and simulated fabrics. The experimental results show that the virtual textures generated using this electrostatic rendering algorithm were perceptually similar to the corresponding real textures for all fabrics tested, underlining the promise of electrostatic tactile displays for material simulation.
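To make the described pipeline concrete, the following is a minimal Python sketch of its two core steps: computing friction coefficients from recorded force traces, and mapping a target coefficient to a drive voltage. The parallel-plate-style electrostatic model F_e = k·V², the device constant k, and the display's bare-surface coefficient mu_bare are illustrative assumptions for this sketch, not details specified in the abstract.

```python
import numpy as np

def friction_coefficients(friction_force, normal_force, eps=1e-6):
    """Per-sample friction coefficient mu = F_t / F_n from recorded force traces."""
    friction_force = np.asarray(friction_force, dtype=float)
    normal_force = np.asarray(normal_force, dtype=float)
    # Guard against division by near-zero normal force at light touch.
    return friction_force / np.maximum(normal_force, eps)

def drive_voltage(mu_target, mu_bare, finger_normal_force, k):
    """
    Map a target friction coefficient to an applied voltage, assuming (hypothetically)
    that the electrostatic attraction adds to the finger's normal force as F_e = k * V^2,
    so that mu_target * F_n = mu_bare * (F_n + F_e).
    """
    f_e = finger_normal_force * (mu_target / mu_bare - 1.0)
    f_e = np.maximum(f_e, 0.0)  # the display cannot lower friction below its bare level
    return np.sqrt(f_e / k)

# Example: render a measured coefficient of 0.8 on a display whose bare coefficient
# is 0.5, for a 0.5 N fingertip force and an assumed device constant k = 1e-6 N/V^2.
v = drive_voltage(mu_target=0.8, mu_bare=0.5, finger_normal_force=0.5, k=1e-6)
print(f"drive voltage ~ {v:.0f} V")
```

In a full rendering loop, mu_target would be read from the fabric's recorded friction-coefficient profile as a function of finger displacement, and the voltage updated continuously as the finger slides.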