Abstract

Among the technological advances in touch-based devices, gesture-based interaction has become a prevalent feature in many application domains, and information systems are starting to explore this type of interaction. At present, gesture specifications are hard-coded by developers at the source code level, which hinders their reusability and portability. Similarly, defining new gestures that reflect user requirements is a complex process. This paper describes a model-driven approach to include gesture-based interaction in desktop information systems. It incorporates a tool prototype that captures user-sketched multi-stroke gestures, transforms them into a model, and automatically generates both the gesture catalogue for gesture-based interaction technologies and the source code of gesture-based user interfaces. We demonstrate our approach in several applications ranging from CASE tools to form-based information systems.

Highlights

  • New devices are appearing with new types of user interfaces

  • This paper introduces a model-driven development (MDD) approach and a tool for including gesture-based interaction in user interfaces for information systems, intended to allow software engineers to focus on the key aspects of these interfaces

  • The method consists of a modelling language to represent multi-stroke gestures and a set of multi-platform model transformations

  • A supporting tool captures the multi-stroke gestures sketched by users, transforms them into a model, and automatically generates the gesture catalogue and the source code needed to include gesture-based interaction in information system (IS) user interfaces (a minimal capture sketch follows this list)

  • The approach was evaluated in two stages: (i) by applying it to three different gesture-based interaction technologies, namely $N [6], quill [7], [8], and SiGeR/iGesture [9]; and (ii) by applying it to include gesture-based interaction in the user interface development process of a forms-based IS
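
To make the tool support concrete, the following minimal Java sketch shows one way a capture component could record a user-sketched multi-stroke gesture as an ordered list of strokes, where each stroke is the point sequence between a pen-down and a pen-up event. The class names and structure are illustrative assumptions for this summary, not the actual gestUI implementation.

    // Hypothetical capture component: NOT the actual gestUI tool.
    import java.awt.Point;
    import java.awt.event.MouseAdapter;
    import java.awt.event.MouseEvent;
    import java.util.ArrayList;
    import java.util.List;
    import javax.swing.JFrame;
    import javax.swing.JPanel;

    /** Accumulates pen-down..pen-up traces; each trace is one stroke. */
    class GestureCapturePanel extends JPanel {
        // The captured multi-stroke gesture: an ordered list of strokes.
        final List<List<Point>> strokes = new ArrayList<>();
        private List<Point> current;

        GestureCapturePanel() {
            MouseAdapter m = new MouseAdapter() {
                @Override public void mousePressed(MouseEvent e) {
                    current = new ArrayList<>();   // pen down: start a new stroke
                    current.add(e.getPoint());
                }
                @Override public void mouseDragged(MouseEvent e) {
                    current.add(e.getPoint());     // sample the trajectory
                }
                @Override public void mouseReleased(MouseEvent e) {
                    strokes.add(current);          // pen up: close the stroke
                }
            };
            addMouseListener(m);
            addMouseMotionListener(m);
        }

        public static void main(String[] args) {
            JFrame f = new JFrame("Sketch a gesture");
            f.add(new GestureCapturePanel());
            f.setSize(300, 300);
            f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            f.setVisible(true);
        }
    }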

Summary

Introduction

New devices are appearing with new types of user interfaces (e.g., interfaces based on gaze, gesture, voice, haptics, and brain-computer interaction). Gesture specifications are currently hard-coded by developers at the source code level, which has a negative impact on reusability and portability and complicates the definition of new gestures. Some of these challenges can be resolved by following a model-driven development (MDD) approach, provided that gestures and gesture-based interaction can be modelled and that it is possible to automatically generate the software components that support them. This paper introduces an MDD approach and a tool for the inclusion of gesture-based interaction in developing user interfaces for information systems, which is intended to allow software engineers to focus on the key aspects of these interfaces. The tool supports the method by capturing the multi-stroke gestures sketched by users, transforming these gestures into a model, and automatically generating the gesture catalogue and the source code to include gesture-based interaction in IS user interfaces. Appendix A describes the inclusion of gestUI in: (i) MARIA [11], an existing model-driven method for user interface development, and (ii) the Model-Based User Interface Design (MBUID) Specification [12] defined by the World Wide Web Consortium (W3C).
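
As an illustration of the generation step described above, the following Java sketch serializes a simple multi-stroke gesture model into an XML catalogue entry. The record types and the XML schema are assumptions made for illustration; the actual gestUI transformations target the native gesture formats of technologies such as $N, quill, and iGesture.

    // Hypothetical model-to-text transformation; element names and XML
    // schema are illustrative, not a real recognizer's catalogue format.
    import java.util.List;

    record Pt(double x, double y) {}
    record Stroke(List<Pt> points) {}
    record Gesture(String name, List<Stroke> strokes) {}

    class CatalogueGenerator {
        /** Serialize one gesture model element into an XML catalogue entry. */
        static String toXml(Gesture g) {
            StringBuilder xml = new StringBuilder();
            xml.append("<gesture name=\"").append(g.name()).append("\">\n");
            for (Stroke s : g.strokes()) {
                xml.append("  <stroke>");
                for (Pt p : s.points()) {
                    xml.append("<point x=\"").append(p.x())
                       .append("\" y=\"").append(p.y()).append("\"/>");
                }
                xml.append("</stroke>\n");
            }
            xml.append("</gesture>");
            return xml.toString();
        }

        public static void main(String[] args) {
            // A two-stroke "plus" gesture: one horizontal and one vertical line.
            Gesture plus = new Gesture("plus", List.of(
                new Stroke(List.of(new Pt(0, 5), new Pt(10, 5))),
                new Stroke(List.of(new Pt(5, 0), new Pt(5, 10)))));
            System.out.println(toXml(plus));
        }
    }

In a full transformation chain, this single serializer would be replaced by one generator per target technology, which is what makes the model reusable across platforms.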

Stroke-based Gestures
Model-driven Related Concepts
Gesture Test Frameworks
Gesture Representation
The Role of Gesture-based Interfaces in IS Engineering
Introduction to gestUI Method
The gestUI Tool
Demonstration of the Method and Tool Support
Summary and Future Work