Abstract

The usability of virtual-keyboard-based eye-typing systems is currently limited by the lack of adaptive and user-centered approaches, leading to low text entry rates and the need for frequent recalibration. In this work, we propose a set of methods for adapting the dwell time in asynchronous mode and the trial period in synchronous mode for gaze-based virtual keyboards. The adaptation rules take into account commands that allow corrections in the application, and they have been tested on a newly developed virtual keyboard for a structurally complex language using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with a multimodal access facility wherein the search for a target item is achieved through gaze detection and the selection can happen via a dwell time, a soft-switch, or gesture detection using surface electromyography in asynchronous mode; in synchronous mode, both the search and the selection may be performed with the eye-tracker alone. The system performance is evaluated in terms of text entry rate and information transfer rate across 20 different experimental conditions. The proposed strategy for adapting the parameters over time shows a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism combining eye-tracking and a soft-switch provides better performance than the adaptive methods using eye-tracking only. The overall system receives an excellent grade on the adjective rating scale of the system usability scale and a low weighted rating on the NASA task load index, demonstrating the user-centered focus of the system.

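The dwell-time adaptation described above can be made concrete with a small sketch. The rule below shortens the dwell time after ordinary selections and lengthens it when the user issues a correction command (e.g. delete/undo), which is the signal the abstract says the adaptation rules take into account. The class name, step sizes, and bounds are illustrative assumptions, not the exact rules proposed in the paper.

```python
# Minimal sketch of dwell-time adaptation driven by correction commands.
# All names and numeric values below are illustrative assumptions.
class DwellTimeAdapter:
    def __init__(self, dwell_ms=1000, min_ms=400, max_ms=2000,
                 shrink_ms=50, grow_ms=100):
        self.dwell_ms = dwell_ms    # current dwell time used for selection
        self.min_ms = min_ms        # lower bound, avoids accidental selections
        self.max_ms = max_ms        # upper bound, keeps typing responsive
        self.shrink_ms = shrink_ms  # reduction after a confirmed selection
        self.grow_ms = grow_ms      # increase after a correction command

    def on_selection(self, was_correction):
        """Update the dwell time after each selection.

        A correction command is treated as evidence that the previous
        selection was unintended, so the dwell time grows; otherwise it
        shrinks gradually to speed up text entry.
        """
        if was_correction:
            self.dwell_ms = min(self.max_ms, self.dwell_ms + self.grow_ms)
        else:
            self.dwell_ms = max(self.min_ms, self.dwell_ms - self.shrink_ms)
        return self.dwell_ms
```

In use, the keyboard would call `on_selection(False)` after every accepted character and `on_selection(True)` whenever the user triggers a correction, so the dwell time drifts toward the shortest value the user can sustain without errors.
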
Highlights

  • Several modalities have recently been evaluated for natural user interface design aimed at intuitive interaction with computers

  • We address these issues with the following novel major contributions: (1) a set of methods for the adaptation over time of the dwell time in asynchronous mode, (2) a set of methods for the adaptation of the trial period in synchronous mode, and (3) a benchmark with beginner users of several dwell-based and dwell-free mechanisms with a multimodal access facility wherein the search for a target item is achieved through gaze detection and the selection can happen via a dwell time, a soft-switch, or gesture detection using surface electromyography in asynchronous mode, while the search and selection may be performed with the eye-tracker alone in synchronous mode

  • This paper presents an efficient set of methods and rules for the adaptation over time of gaze-controlled multimodal virtual keyboards in synchronous and asynchronous modes

Summary

Introduction

Several modalities have recently been evaluated for natural user interface design aimed at intuitive interaction with computers. Electroencephalogram (EEG) based brain-computer interfaces (BCI), eye-tracking based human-computer interfaces (HCI), electromyography (EMG) based gesture recognition, speech recognition, and different input access switches have been adopted as natural user interface methods [1,2,3,4]. Among these approaches, eye-tracking considers either the position of the eye relative to the head or the orientation of the eyes in space, i.e., the point of regard. In a remote-camera-based method, the gaze position is captured through non-contacting fixed cameras without any additional equipment or support. In this case, because the image resolution for the eye is relatively low, pupil tremors cause severe vibrations of the calculated gaze point. The difficulty of choosing an effective dwell time has encouraged some researchers to propose adaptive strategies for selecting the dwell time [19,20]. With such enhancements, users can select desired items more reliably, increasing the overall system performance. One of the key drawbacks of this method is the requirement of extra selection time.
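
As an illustration of why dwell-based selection costs extra time per item (the drawback noted above), the sketch below shows a generic dwell-selection loop: a key is selected only after the gaze has remained on it for the full dwell period, so every selection includes that waiting time. The callbacks passed in are hypothetical placeholders; this is a generic illustration, not the paper's implementation.

```python
import time

def dwell_select_loop(current_gaze_key, on_key_selected,
                      dwell_s=1.0, poll_s=0.02):
    """Generic dwell-based selection loop (illustrative sketch).

    `current_gaze_key()` should return the key under the gaze point (or None),
    and `on_key_selected(key)` is called once the dwell period completes;
    both are hypothetical callbacks supplied by the caller.
    """
    fixated_key = None
    fixation_start = None
    while True:
        key = current_gaze_key()
        now = time.monotonic()
        if key != fixated_key:
            # Gaze moved to a different key (or away): restart the dwell timer.
            fixated_key, fixation_start = key, now
        elif key is not None and now - fixation_start >= dwell_s:
            # Gaze held for the full dwell period: commit the selection.
            on_key_selected(key)
            fixated_key, fixation_start = None, None
        time.sleep(poll_s)  # poll the gaze position at a fixed rate
```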
