Abstract

Vision-based hand gesture interaction is natural and intuitive, since we naturally use gestures to communicate with other people. However, users are widely reported to experience discomfort and fatigue with gesture-controlled interfaces because of the lack of physical feedback. To address this problem, we propose a complete hand gesture control system that delivers immersive tactile feedback to the user's hand. To this end, we first developed a fast and accurate hand-tracking algorithm for a Kinect sensor based on the proposed MLBP (modified local binary pattern), which can efficiently analyze 3D shapes in depth images. The accuracy and speed of our tracking method were verified by comparison with existing methods: Natural Interaction Technology for End-user (NITE), 3D Hand Tracker, and CamShift. As a second step, a new tactile feedback technology based on a piezoelectric actuator was developed and integrated with the hand-tracking and DTW (dynamic time warping) gesture recognition algorithms to form a complete immersive gesture control system. Quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that gesture control with tactile feedback is a promising technology compared with conventional vision-based gesture control, which typically provides no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines for developing more natural gesture control systems and immersive user interfaces with haptic feedback.
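To make the recognition step more concrete, below is a minimal sketch of the classical DTW distance between two 1-D gesture trajectories, written in C. The feature representation, step pattern, and any path constraints used in the actual system are not described in this summary, so the function `dtw_distance` and the sample trajectories are purely illustrative assumptions.

```c
/* Minimal sketch of DTW (dynamic time warping) distance between two
 * 1-D gesture trajectories, e.g. sequences of hand x-coordinates.
 * Illustrative only: the paper's actual features and distance metric
 * are not reproduced here. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

static double dtw_distance(const double *a, int n, const double *b, int m)
{
    /* (n+1) x (m+1) accumulated-cost matrix, initialized to "infinity" */
    double *D = malloc((size_t)(n + 1) * (m + 1) * sizeof *D);
    for (int i = 0; i <= n; ++i)
        for (int j = 0; j <= m; ++j)
            D[i * (m + 1) + j] = HUGE_VAL;
    D[0] = 0.0;

    for (int i = 1; i <= n; ++i) {
        for (int j = 1; j <= m; ++j) {
            double cost = fabs(a[i - 1] - b[j - 1]);
            double up   = D[(i - 1) * (m + 1) + j];
            double left = D[i * (m + 1) + j - 1];
            double diag = D[(i - 1) * (m + 1) + j - 1];
            double best = up < left ? up : left;
            if (diag < best) best = diag;
            D[i * (m + 1) + j] = cost + best;
        }
    }
    double result = D[n * (m + 1) + m];
    free(D);
    return result;
}

int main(void)
{
    /* Hypothetical recorded template and live input trajectory */
    double template_x[] = {0.0, 0.2, 0.5, 0.9, 1.0};
    double input_x[]    = {0.0, 0.1, 0.2, 0.6, 0.9, 1.0};
    printf("DTW distance: %f\n", dtw_distance(template_x, 5, input_x, 6));
    return 0;
}
```

In a recognition system of this kind, the input trajectory would typically be compared against one template per gesture class and assigned to the class with the smallest DTW distance.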

Highlights

  • Over the past few years, the demand for hand-interactive user scenarios has grown rapidly in many applications such as mobile devices, smart TVs, games, virtual reality, medical device control, the automobile industry, and even rehabilitation [1,2,3,4,5,6,7,8].

  • Our proposed MLBP method offers fast and accurate hand tracking, making it suitable for a real-time gesture control system with tactile feedback (see the sketch after this list).

  • The data acquisition was implemented on the Open Natural Interaction (OpenNI) platform, while the other modules were implemented in C on a Windows machine with a 3.93-GHz Intel Core i7 870 and 8 GB of RAM.
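As referenced in the MLBP highlight above, the following is a minimal C sketch of a standard 8-neighbor local binary pattern computed over a depth image. The paper's modified LBP (MLBP) adapts this operator for 3D shape analysis in depth maps; the exact modification is not given in this summary, so the thresholding rule, the function `lbp8`, and the synthetic depth patch below are illustrative assumptions only.

```c
/* Minimal sketch of a standard 8-neighbor local binary pattern (LBP)
 * over a depth image: each pixel is encoded by comparing it with its
 * 8 neighbors. The paper's MLBP modifies this basic operator. */
#include <stdint.h>
#include <stdio.h>

/* depth: row-major 16-bit depth image (e.g. from a Kinect sensor)
 * codes: output 8-bit LBP code per pixel (border pixels left as 0) */
static void lbp8(const uint16_t *depth, uint8_t *codes, int w, int h)
{
    /* neighbor offsets, clockwise from the top-left pixel */
    const int dx[8] = {-1, 0, 1, 1, 1, 0, -1, -1};
    const int dy[8] = {-1, -1, -1, 0, 1, 1, 1, 0};

    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            uint16_t center = depth[y * w + x];
            uint8_t code = 0;
            for (int k = 0; k < 8; ++k) {
                uint16_t nb = depth[(y + dy[k]) * w + (x + dx[k])];
                /* set bit k when the neighbor is at least as far away
                 * as the center pixel */
                if (nb >= center)
                    code |= (uint8_t)(1u << k);
            }
            codes[y * w + x] = code;
        }
    }
}

int main(void)
{
    /* Tiny synthetic 4x4 depth patch (values in millimeters);
     * the two small values imitate near (hand) pixels against a
     * farther background. */
    uint16_t depth[16] = {
        800, 805, 810, 815,
        800, 400, 405, 815,
        800, 410, 415, 815,
        800, 805, 810, 815
    };
    uint8_t codes[16] = {0};
    lbp8(depth, codes, 4, 4);
    printf("LBP code at (1,1): %u\n", codes[1 * 4 + 1]);
    return 0;
}
```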

Summary

Introduction

Over the past few years, the demand for hand-interactive user scenarios has grown rapidly in many applications such as mobile devices, smart TVs, games, virtual reality, medical device control, the automobile industry, and even rehabilitation [1,2,3,4,5,6,7,8]. There is strong evidence that human-computer interface technologies are moving towards more natural, intuitive communication between people and computing devices [11]. For this reason, vision-based hand gesture control has been widely studied and applied in our daily lives. Co-located touch feedback is imperative for immersive gesture control that offers users a more natural interface. From this perspective, developing a fast and accurate 3D hand-tracking algorithm is extremely important, yet challenging, for achieving real-time, mid-air touch feedback.
