Abstract

The generation of stable and realistic haptic feedback during mid-air gesture interactions has recently garnered significant research interest. However, limitations of the sensing technologies, such as unstable tracking, range limitations, nonuniform sampling durations, self-occlusions, and motion-recognition faults, significantly distort motion-based haptic feedback. In this paper, we propose and implement a hidden Markov model (HMM)-based motion synthesis method to generate stable concurrent and terminal vibrotactile feedback. The system tracks human gestures during interaction and recreates smooth, synchronized motion data from the detected HMM states. Four gestures (tapping, three-fingered zooming, vertical dragging, and horizontal dragging) were used to evaluate the performance of the motion synthesis method. The reference motion curves and the corresponding primitive motion elements to be synthesized for each gesture were obtained from multiple subjects at different interaction speeds using a stable motion tracking sensor. Both objective and subjective evaluations were conducted to assess the performance of the motion synthesis model in controlling concurrent and terminal vibrotactile feedback. The objective evaluation showed that the synthesized motion data correlated more strongly with the reference motion data, in both shape and end timing, than the measured and moving-average-filtered data; the mean R^{2} values for the synthesized motion data were always greater than 0.7, even under unstable tracking conditions. A subjective evaluation with nine subjects showed a significant improvement in the perceived synchronization of vibrotactile feedback driven by the synthesized motion.
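To make the described pipeline concrete, the Python sketch below is a minimal, illustrative example and not the authors' implementation: it decodes a two-state "idle"/"moving" segmentation from a noisy 1-D fingertip trajectory with a Gaussian-emission Viterbi pass, replaces each detected moving segment with a smooth primitive (a minimum-jerk profile, an assumed stand-in for the gesture-specific primitive motion elements), and scores the result against the reference curve with R^2. The toy gesture signal, all HMM parameters, and the helper names viterbi_gaussian, minimum_jerk, and r_squared are assumptions for illustration only.

import numpy as np


def viterbi_gaussian(obs, means, variances, trans, start):
    """Most-likely state path for 1-D observations with Gaussian emissions."""
    n_states = len(means)
    T = len(obs)
    log_emit = -0.5 * (np.log(2.0 * np.pi * variances)
                       + (obs[:, None] - means) ** 2 / variances)  # (T, n_states)
    log_trans = np.log(trans)
    delta = np.log(start) + log_emit[0]
    back = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans          # rows: from-state, cols: to-state
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(n_states)] + log_emit[t]
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmax(delta))
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path


def minimum_jerk(x0, x1, n):
    """Smooth point-to-point primitive used here as an illustrative motion element."""
    s = np.linspace(0.0, 1.0, n)
    return x0 + (x1 - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)


def r_squared(reference, estimate):
    """Coefficient of determination of the estimate against the reference curve."""
    ss_res = np.sum((reference - estimate) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return 1.0 - ss_res / ss_tot


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Toy drag-like gesture: hold, smooth move, hold; "measured" adds tracking noise.
    reference = np.concatenate([np.zeros(30), minimum_jerk(0.0, 1.0, 60), np.ones(30)])
    measured = reference + rng.normal(0.0, 0.03, reference.size)

    # Observation for the HMM: lightly smoothed absolute velocity.
    speed = np.abs(np.convolve(np.gradient(measured), np.ones(5) / 5, mode="same"))

    # Two-state HMM, state 0 = idle, state 1 = moving (all parameters are assumed).
    path = viterbi_gaussian(speed,
                            means=np.array([0.004, 0.02]),
                            variances=np.array([1e-4, 2e-4]),
                            trans=np.array([[0.97, 0.03], [0.03, 0.97]]),
                            start=np.array([0.5, 0.5]))

    # Re-synthesize the trajectory: idle segments hold position, moving segments
    # are replaced by a smooth primitive toward the measured segment endpoint.
    synthesized = np.empty_like(measured)
    boundaries = np.flatnonzero(np.diff(path)) + 1
    for a, b in zip(np.r_[0, boundaries], np.r_[boundaries, len(path)]):
        start_value = synthesized[a - 1] if a > 0 else measured[0]
        if path[a] == 1:
            synthesized[a:b] = minimum_jerk(start_value, measured[b - 1], b - a)
        else:
            synthesized[a:b] = start_value

    print(f"R^2 of measured    vs reference: {r_squared(reference, measured):.3f}")
    print(f"R^2 of synthesized vs reference: {r_squared(reference, synthesized):.3f}")
    # A concurrent vibrotactile driver could map the synthesized velocity to
    # vibration amplitude and fire a terminal cue at the last state transition.

The minimum-jerk profile is only one plausible choice of primitive; in the approach described above, the primitive motion elements are gesture-specific and derived from recorded reference motions.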

Highlights

  • The advent of affordable, small-sized motion tracking sensors such as Leap Motion and Kinect has made mid-air interactions viable in new application areas such as desktop computers, interactive tabletops, and in-car interfaces

  • Motion elements were synthesized based on recognized gestures to control the vibrotactile feedback

  • Four gestures were used in the study to evaluate the performance of the motion synthesis method


Introduction

The advent of affordable, small-sized motion tracking sensors such as Leap Motion and Kinect has made mid-air interactions viable in new application areas such as desktop computers, interactive tabletops, and in-car interfaces. Portable virtual and augmented reality interfaces have recently accelerated the use of mid-air interaction as a human–computer interaction technique. Haptic feedback plays an important role in mid-air gesture interactions by conveying information about the physical presence of the objects with which users are interacting. Previous research has shown that delivering appropriate haptic cues to users during gesture input can aid gesture training [1] and task performance [2], as well as improve the overall user experience [3].


