Abstract

Gestures are a natural input method for human communication and may be an effective way for drivers to interact with in-vehicle infotainment systems (IVIS). Most existing work on gesture-based human-computer interaction (HCI), both inside and outside the vehicle, focuses on how well gestures can be distinguished by computer systems. The purpose of this study was to identify gesture sets used for IVIS tasks and to compare task times between gesturing and touchscreen input across the different functions. Task times for user-defined gestures were shorter than for a novel touchscreen. Several functions yielded relatively intuitive gesture mappings across participants (e.g., zooming in and zooming out on a map), while others did not produce strong mappings (e.g., decreasing volume and playing the next song). The findings of this study suggest that user-centric gestures can be used instead of touchscreens to interact with IVIS, and future work should evaluate how to account for variability in intuitive gestures. Understanding gesture variability among end users can support the development of an in-vehicle gestural input system that is intuitive for all users.
