Abstract

With the advent of the ubiquitous computing era, research into better human-computer interaction (HCI) for human-focused interfaces has intensified. The natural user interface (NUI), in particular, is being actively investigated with the objective of making interaction between humans and computers more intuitive and simpler. However, developing NUI-based applications without specialized NUI-related knowledge is difficult. This paper proposes a NUI-specific SDK, called “Gesture SDK,” for the development of NUI-based applications. Gesture SDK provides a gesture generator with which developers can directly define gestures. Further, a “Gesture Recognition Component” is provided that enables defined gestures to be recognized by applications. We generated gestures using the proposed SDK and developed “Smart Interior,” a NUI-based application, using the Gesture Recognition Component. The results of the experiments conducted indicate that the recognition rate of the generated gestures was 96% on average.
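The abstract describes a two-part workflow: gestures are first defined with a gesture generator, and applications then recognize them through the Gesture Recognition Component. The following is a minimal sketch of how that workflow might look from an application developer's point of view; the class and method names (GestureGenerator, GestureRecognitionComponent, define_gesture, register, on_frame) are illustrative assumptions, not the SDK's actual API.

```python
# Hypothetical sketch of the workflow the abstract describes: a developer
# defines a gesture with a generator, then an application registers that
# gesture with a recognition component and reacts when it is detected.
# All names here are illustrative, not the real Gesture SDK API.

class GestureGenerator:
    """Stands in for the SDK's gesture generator (names assumed)."""
    def define_gesture(self, name, hand_path):
        # A gesture is modeled here as a named sequence of hand positions.
        return {"name": name, "path": hand_path}

class GestureRecognitionComponent:
    """Stands in for the SDK's Gesture Recognition Component (names assumed)."""
    def __init__(self):
        self._handlers = {}

    def register(self, gesture, handler):
        # Bind an application callback to a defined gesture.
        self._handlers[gesture["name"]] = handler

    def on_frame(self, recognized_name):
        # Called by the tracking pipeline when a gesture is matched.
        handler = self._handlers.get(recognized_name)
        if handler:
            handler()

# Application side: define a "swipe right" gesture and bind it to an action,
# e.g. moving a furniture item in a Smart Interior-style scene.
generator = GestureGenerator()
swipe_right = generator.define_gesture("swipe_right", ["left", "center", "right"])

recognizer = GestureRecognitionComponent()
recognizer.register(swipe_right, lambda: print("Move selected item to the right"))

# Simulate the recognizer matching the gesture on an incoming frame.
recognizer.on_frame("swipe_right")
```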

Highlights

  • The major advantage of ubiquitous computing is that it enables users to use computers and networks in natural and intuitive ways

  • This paper proposes a natural user interface (NUI)-specific SDK, called “Gesture SDK,” for development of NUI-based applications

  • The paper introduces “Smart Interior,” a NUI-based application developed using the Gesture SDK proposed in this paper


Summary

Introduction

The major advantage of ubiquitous computing is that it enables users to use computers and networks in natural and intuitive ways. Research related to ubiquitous concepts has played a significant role in realizing the future of computing, in which computers unobtrusively support humans in everyday life. In this context, human-computer interaction (HCI) is being actively investigated to facilitate the implementation of human-focused computing environments. The interaction approach using gestures provides a more natural interface and enables users to operate computer systems intuitively [6]. The proposed approach shows the identification area on the screen, thereby enabling users to obtain feedback on the execution of a gesture by watching the screen. The system is suitable for application development because gestures are quickly recognized. This paper proposes a hand gesture SDK that facilitates the development of NUI-based applications without any specific NUI knowledge.
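To make the on-screen feedback idea concrete, the sketch below assumes that each gesture is associated with a rectangular identification area on the screen and that the interface highlights the area while the tracked hand is inside it. The IdentificationArea class, its fields, and the render_feedback function are hypothetical illustrations, not part of the paper's implementation.

```python
# Minimal sketch (hypothetical, not from the paper's code) of screen-based
# feedback: each gesture has an identification area, and the interface
# highlights the area while the user's hand is inside it, so the user can
# confirm the gesture is being executed by watching the screen.

from dataclasses import dataclass

@dataclass
class IdentificationArea:
    name: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, hand_x, hand_y):
        return (self.x <= hand_x < self.x + self.width and
                self.y <= hand_y < self.y + self.height)

# Two example areas placed at the left and right of a 640-pixel-wide view.
areas = [
    IdentificationArea("rotate_left", 0, 100, 200, 200),
    IdentificationArea("rotate_right", 440, 100, 200, 200),
]

def render_feedback(hand_x, hand_y):
    # Highlight whichever identification area the tracked hand is in.
    for area in areas:
        state = "ACTIVE" if area.contains(hand_x, hand_y) else "idle"
        print(f"{area.name}: {state}")

render_feedback(hand_x=500, hand_y=150)  # e.g. hand over the right-hand area
```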

Related Work
The Gesture SDK
Experiment and Analysis
Conclusion