Background: Hand gesture interfaces are dedicated programs that principally perform hand tracking and hand gesture prediction to provide alternative controls and interaction methods. They take advantage of one of the most natural ways humans interact and communicate, offering a novel input modality with great potential in the field of human-computer interaction. Developing a flexible and rich hand gesture interface is known to be a time-consuming and arduous task. Previously published studies have demonstrated the significance of the finite-state machine (FSM) approach for mapping detected gestures to GUI actions.

Methods: In our hand gesture interface, we extended the FSM approach by utilizing gesture-specific attributes, such as the distance between hands, the distance from the camera, and the time of occurrence, to enable users to perform unique GUI actions. These attributes are derived from the hand gestures detected by the RealSense SDK employed in our hand gesture interface. By means of these gesture-specific attributes, users can activate static gestures and perform them as dynamic gestures. We also provided supplementary features to enhance the efficiency, convenience, and user-friendliness of our hand gesture interface. Moreover, we developed a complementary application that records hand gestures by capturing hand keypoints in depth and color images, facilitating the generation of hand gesture datasets.

Results: We conducted a small-scale user survey with fifteen subjects to test and evaluate our hand gesture interface. Anonymous feedback obtained from the users indicates that our hand gesture interface is sufficiently easy and self-explanatory to use. In addition, we received constructive feedback about minor flaws in the responsiveness of the interface.

Conclusions: We proposed a hand gesture interface, along with key concepts for attaining user-friendly and effective control of existing GUIs.
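To make the attribute-driven FSM idea concrete, the minimal sketch below shows one way such a state machine could combine a detected static gesture with its attributes (time of occurrence, distance from the camera) to trigger dynamic GUI actions. The gesture label "pinch", the 0.5 s hold threshold, and the gui callbacks (begin_drag, set_zoom, end_drag) are illustrative assumptions, not details from the paper; in the actual interface the gesture events would come from the RealSense SDK.

```python
from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    ACTIVATED = auto()
    DRAGGING = auto()


@dataclass
class GestureEvent:
    """A detected static gesture plus the gesture-specific attributes."""
    name: str                 # hypothetical gesture label, e.g. "pinch"
    hand_distance: float      # distance between hands, in meters
    camera_distance: float    # distance from the camera, in meters
    timestamp: float          # time of occurrence, in seconds


class GestureFSM:
    """Sketch of an FSM mapping gestures to GUI actions via attributes.

    Thresholds, gesture names, and GUI callbacks are assumptions made
    for illustration, not values taken from the paper.
    """
    HOLD_SECONDS = 0.5        # how long a static gesture must persist to activate

    def __init__(self, gui):
        self.gui = gui        # hypothetical object exposing GUI actions
        self.state = State.IDLE
        self.pinch_started = 0.0

    def update(self, e: GestureEvent) -> None:
        if self.state == State.IDLE:
            if e.name == "pinch":
                # Static gesture detected: remember when it began.
                self.pinch_started = e.timestamp
                self.state = State.ACTIVATED
        elif self.state == State.ACTIVATED:
            if e.name != "pinch":
                self.state = State.IDLE
            elif e.timestamp - self.pinch_started >= self.HOLD_SECONDS:
                # A held static gesture is promoted to a dynamic drag gesture.
                self.gui.begin_drag()
                self.state = State.DRAGGING
        elif self.state == State.DRAGGING:
            if e.name == "pinch":
                # Example of an attribute-driven action: map the distance
                # from the camera to a zoom level while dragging.
                self.gui.set_zoom(1.0 / max(e.camera_distance, 0.1))
            else:
                self.gui.end_drag()
                self.state = State.IDLE
```

In this sketch, the same static "pinch" gesture yields different GUI actions depending on its attributes: held briefly it does nothing, held past the threshold it starts a drag, and while dragging its distance from the camera continuously controls zoom, which is the sense in which a static gesture can be performed as a dynamic one.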