Abstract
This paper presents a perceptually transparent rendering algorithm for an ultrasound-based mid-air haptic device. In a series of experiments, we derive a systematic mapping function from the device command value to the user's perceived magnitude of mid-air vibrotactile feedback. The algorithm is designed for an ultrasonic mid-air haptic interface that displays vibrotactile feedback at a focal point in mid-air using a phased-array technique. The perceived magnitude at the focal point depends on input parameters such as the command intensity, the modulation frequency, and the position of the focal point in the workspace. The algorithm automatically tunes these parameters so that the desired perceived output at the user's hand is precisely controlled. Through a series of experiments, we map the effect of these parameters on the physical output pressure and formulate the relation between this output pressure and the final perceived magnitude, yielding an overall mapping from the parameters to the perceived magnitude. Finally, we evaluate the complete transparent rendering algorithm and show that it achieves better perceptual quality than rendering with a simple intensity command.
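To illustrate the structure of such a two-stage rendering pipeline, the following minimal Python sketch assumes (1) a calibrated forward map from command intensity, modulation frequency, and focal-point position to output pressure, and (2) a Stevens-style power law relating pressure to perceived magnitude; transparent rendering then inverts both maps. All coefficients, function names, and parameter values here are hypothetical placeholders, not the calibration reported in the paper.

import numpy as np

# Hypothetical calibration: pressure grows sub-linearly with command intensity,
# peaks near an assumed modulation frequency, and falls off with focal distance.
def command_to_pressure(intensity, mod_freq_hz, focal_pos_m):
    distance = np.linalg.norm(focal_pos_m)                        # metres from array centre
    freq_gain = 1.0 / (1.0 + abs(mod_freq_hz - 200.0) / 400.0)    # assumed peak near 200 Hz
    return 2.0 * np.sqrt(intensity) * freq_gain * np.exp(-2.0 * distance)  # pressure in Pa (assumed)

# Assumed psychophysical law: perceived magnitude ~ K * pressure^A
K, A = 1.0, 0.7

def pressure_to_percept(pressure_pa):
    return K * pressure_pa ** A

def percept_to_pressure(target_percept):
    return (target_percept / K) ** (1.0 / A)

def render_command(target_percept, mod_freq_hz, focal_pos_m):
    """Invert both maps: target percept -> required pressure -> command intensity."""
    target_pressure = percept_to_pressure(target_percept)
    # Invert command_to_pressure numerically over the normalized command range [0, 1].
    candidates = np.linspace(0.0, 1.0, 1001)
    pressures = command_to_pressure(candidates, mod_freq_hz, focal_pos_m)
    return candidates[np.argmin(np.abs(pressures - target_pressure))]

if __name__ == "__main__":
    cmd = render_command(target_percept=1.2, mod_freq_hz=160.0,
                         focal_pos_m=np.array([0.0, 0.0, 0.2]))
    print(f"command intensity = {cmd:.3f}")

In this sketch the inversion is done by a brute-force search over the command range; any monotone forward map could equally be inverted analytically or with a lookup table built from the calibration experiments.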