Abstract

As a means for the visually impaired to operate graphical user interface (GUI) applications, screen-reader software that combines keypad operation with a speech synthesizer has emerged. With keypad operation, however, searching for GUI objects takes more time than with a mouse. In other words, the gap between sighted and visually impaired users in access to information has gradually widened with the advent of the GUI, and a system that allows the visually impaired to operate GUIs comfortably is needed. From this viewpoint, the authors propose a nonvisual representation of GUI objects in which the hierarchy and functions of the objects are conveyed by tactile sensations and their labels by voice. To examine the effectiveness of the proposed method, an evaluation experiment with visually impaired subjects was performed. Analysis of the results shows that the proposed search process finds the desired GUI object more quickly and with less burden on the user than a keypad-based search. It is also observed that visually impaired users employ direct operations when searching for GUI objects by touch. © 1999 Scripta Technica, Electron Comm Jpn Pt 3, 82(8): 40–49, 1999
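To make the proposed representation concrete, the following is a minimal sketch based only on what the abstract states: each GUI object carries a tactile pattern determined by its type (conveying its function and its place in the hierarchy) and a textual label to be spoken. All names below (TactileGuiObject, TACTILE_PATTERNS, explore) are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical mapping from widget type to a tactile pattern identifier.
TACTILE_PATTERNS = {
    "window": "raised-border",
    "menu": "ridged-bar",
    "button": "dome",
    "checkbox": "dot-pair",
}

@dataclass
class TactileGuiObject:
    widget_type: str          # e.g. "button", "menu"
    label: str                # text announced by the speech synthesizer
    children: List["TactileGuiObject"] = field(default_factory=list)

    def tactile_pattern(self) -> str:
        # The object's function is conveyed through touch.
        return TACTILE_PATTERNS.get(self.widget_type, "flat")

def explore(obj: TactileGuiObject, depth: int = 0) -> None:
    """Depth-first walk of the GUI hierarchy: the tactile pattern conveys
    the object's type and nesting level, while the label is spoken."""
    print(f"{'  ' * depth}[touch: {obj.tactile_pattern()}] speak: {obj.label}")
    for child in obj.children:
        explore(child, depth + 1)

if __name__ == "__main__":
    root = TactileGuiObject("window", "Text editor", [
        TactileGuiObject("menu", "File", [
            TactileGuiObject("button", "Open"),
            TactileGuiObject("button", "Save"),
        ]),
        TactileGuiObject("checkbox", "Word wrap"),
    ])
    explore(root)
```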
