Abstract

We conducted an experiment in which participants carried out six gaze gesture tasks. The gaze paths were analyzed to determine the speed and accuracy of the gaze gestures. The gaze gestures took more time than we anticipated, and only the very fastest participants came close to the expected times. Performance times differed little between small and large gaze gestures, because the participants reached significantly higher speeds when making large gestures than when making small ones. Curved shapes were found difficult to follow and time-consuming when followed properly. In general, the accuracy in following shapes was sometimes very poor. We believe that to improve the speed and accuracy of gaze gestures, proper feedback must be used more extensively than in our experiment.

Highlights

  • Gestures are a familiar concept from mouse- and pen-based interaction

  • Gaze gestures are essentially short snippets of the user’s gaze path that are interpreted as commands issued by the user, which makes it challenging to separate gaze gestures from looking around and from other actions

  • One claimed advantage of gaze gestures is that they do not need to be accurate; nevertheless, we were especially interested in their accuracy


Summary

Introduction

Gestures are a familiar concept from mouse- and pen-based interaction, but gaze gestures have only recently started to interest researchers. Gaze gestures are essentially short snippets of the user’s gaze path that are interpreted as commands issued by the user, which makes it challenging to separate gaze gestures from looking around and from other actions. Some of the biggest challenges in gaze interaction are calibration and accuracy. Calibration problems often occur if the user shifts position during the interaction or if the initial calibration has been done poorly. Calibration also becomes less accurate during use, so the user has to recalibrate the eye tracker (Hansen et al., 2008). Drewes and Schmidt (2007) claimed that the use of gaze gestures would solve these problems, since gaze gestures can be used without any calibration if designed properly.
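To make the idea concrete, a common way to turn a gaze path into a recognizable gesture is to quantize successive displacements into a small set of directional strokes and match the resulting stroke sequence against gesture templates. The sketch below is illustrative only: the 8-direction scheme, the `min_len` threshold, and all names are assumptions, not details from the paper. The distance threshold is what lets such a recognizer ignore small fixational movements and only react to deliberate, large saccade-like motions.

```python
import math

# Illustrative sketch (assumed design, not the paper's method): collapse a
# gaze path into a sequence of compass-direction strokes.

DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def quantize(dx, dy):
    """Map a displacement vector to one of 8 compass directions."""
    # Screen y grows downward, so negate dy to get conventional angles.
    angle = math.atan2(-dy, dx) % (2 * math.pi)
    index = int((angle + math.pi / 8) / (math.pi / 4)) % 8
    return DIRECTIONS[index]

def strokes(path, min_len=50.0):
    """Collapse a gaze path (list of (x, y) points) into strokes.

    Only displacements longer than min_len pixels count, which helps
    separate deliberate gestures from small fixational movements.
    """
    out = []
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dx, dy = x1 - x0, y1 - y0
        if math.hypot(dx, dy) < min_len:
            continue
        d = quantize(dx, dy)
        if not out or out[-1] != d:  # merge repeated directions
            out.append(d)
    return out

# A rough "L" shape: a long move down, then a long move right.
path = [(100, 100), (100, 250), (260, 250)]
print(strokes(path))  # ['S', 'E']
```

A recognizer built this way needs no calibration at all, since only relative displacements matter, which is the property Drewes and Schmidt's argument relies on.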

Methods
Results
Discussion
Conclusion

