Abstract

Human–computer interaction evolves together with the technology used to achieve it. This paper compares the ease of use of a classic touch user interface (TUI) with that of a gesture-based user interface (GBUI) on a mobile device. Two body positions are used for comparison: "normal", with the individual standing, and "special", with the individual lying down. The first represents a common position for healthy individuals, while the second represents individuals who are ill, disabled, or simply resting. Two software applications were developed to run an interaction test for the TUI and the GBUI, performed by 25 users. The tests show that the GBUI achieves a reasonable average accuracy of 88.5% in the special position and 89.8% in the normal position, while the TUI reaches an average accuracy of 96.5% in the special position and 97.4% in the normal position. The GBUI could therefore serve as a complementary form of interaction to the TUI, useful to any user and especially in cases of illness or simply for comfort. As a case study, we developed a software solution called SICLLE (emergency call control interactive system) that manages phone calls on a smartphone and applies the concept of a gestural control tree to provide gestural navigation through a gestural language built from a gestural vocabulary. An architecture for mobile devices supporting a GBUI is proposed and used in the implementation of SICLLE. SICLLE allows gestures to be personalized, which optimizes the gestural dictionary and makes the system easier to learn and use. In addition, SICLLE incorporates an audio guide that replaces the typical visual feedback, reaching total acceptance in the experiment conducted with real users.
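The gestural control tree mentioned above can be pictured as a tree whose edges are gesture tokens from the user's vocabulary and whose nodes are menus or actions. The following is a minimal illustrative sketch of that idea; all names, gestures, and menu entries here are hypothetical and do not reproduce the actual SICLLE implementation.

```python
class GestureNode:
    """One node of a hypothetical gestural control tree."""

    def __init__(self, label, action=None):
        self.label = label      # human-readable menu label
        self.action = action    # optional callable executed at a leaf
        self.children = {}      # gesture token -> child GestureNode

    def add_child(self, gesture, node):
        self.children[gesture] = node
        return node


def navigate(root, gestures):
    """Follow a sequence of recognized gestures down the tree.

    Returns the label of the node reached (running its action, if any),
    or None if a gesture is not in the current node's vocabulary.
    """
    node = root
    for g in gestures:
        if g not in node.children:
            return None  # unrecognized gesture at this menu level
        node = node.children[g]
    if node.action is not None:
        node.action()
    return node.label


# Example: a tiny call-management menu navigated by gesture sequences.
root = GestureNode("main menu")
calls = root.add_child("swipe_right", GestureNode("calls"))
calls.add_child("circle", GestureNode("answer call", action=lambda: None))
calls.add_child("zigzag", GestureNode("reject call", action=lambda: None))

print(navigate(root, ["swipe_right", "circle"]))  # -> answer call
```

Personalizing the gestures, as SICLLE allows, would amount to remapping the edge tokens of such a tree to each user's preferred vocabulary.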
