Abstract

Although eye-based interactions can be beneficial for people with motor impairments, they often rely on clunky or specialized equipment (e.g., stationary eye-trackers) and focus primarily on gaze and blinks. However, the two eyelids can open and close in different orders and for different durations to form rich eyelid gestures. We take a first step toward designing, detecting, and evaluating a set of eyelid gestures for people with motor impairments on mobile devices. We present an algorithm that detects nine eyelid gestures on smartphones in real time and evaluate it in two studies with 12 able-bodied people and 4 people with severe motor impairments. The results of the study with people with motor impairments show that the algorithm can detect the gestures with overall accuracies of .76 and .69 in user-dependent and user-independent evaluations, respectively. Furthermore, we design and evaluate a gesture mapping scheme that allows people with motor impairments to navigate mobile applications using only eyelid gestures. Finally, we discuss considerations for designing and using eyelid gestures for people with motor impairments.
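
For illustration only, the sketch below shows one way eyelid gestures could be represented as sequences of per-eye open/closed states and matched against known patterns; the gesture names, state encoding, and helper functions are assumptions made for exposition and are not the paper's actual detection algorithm.

```python
# Illustrative sketch only: the gesture vocabulary, state encoding, and
# matching scheme below are hypothetical, not the paper's algorithm.

from itertools import groupby

# Per-frame eyelid state: (left_closed, right_closed) booleans, e.g. derived
# from an eye-openness estimator running on the smartphone's front camera.
Frame = tuple[bool, bool]

# Hypothetical mapping from a compressed state sequence to a gesture label.
GESTURES = {
    ((False, False), (True, True), (False, False)): "blink-both",
    ((False, False), (True, False), (False, False)): "wink-left",
    ((False, False), (False, True), (False, False)): "wink-right",
    ((False, False), (True, False), (True, True), (False, False)): "close-left-then-both",
}


def compress(frames: list[Frame]) -> tuple[Frame, ...]:
    """Collapse consecutive identical eyelid states into a single entry."""
    return tuple(state for state, _ in groupby(frames))


def classify(frames: list[Frame]) -> str | None:
    """Return a gesture label if the compressed state sequence matches one."""
    return GESTURES.get(compress(frames))


if __name__ == "__main__":
    # Both eyes open, then only the left eyelid closes, then both open again.
    frames = [(False, False)] * 5 + [(True, False)] * 3 + [(False, False)] * 5
    print(classify(frames))  # -> "wink-left"
```

In practice, a real-time detector would also need to account for state duration (e.g., distinguishing a blink from a deliberate close) and noisy per-frame estimates, which this sketch omits.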
