Abstract
Manual accessibility evaluation plays an important role in validating the accessibility of Web pages. This role has become increasingly critical with the advent of the Web Content Accessibility Guidelines (WCAG) 2.0 and their reliance on user evaluation to validate certain conformance measures. However, the role of expertise in such evaluations is unknown and has not previously been studied. This paper investigates the interplay between expert and non-expert evaluation by conducting a Barrier Walkthrough (BW) study with 19 expert and 51 non-expert judges. The BW method provides an evaluation framework that can be used to manually assess the accessibility of Web pages for different user groups, including motor-impaired, hearing-impaired, low-vision, and cognitively impaired users. We conclude that the level of expertise is an important factor in the quality of accessibility evaluations of Web pages. Expert judges spent significantly less time than non-experts; rated themselves as more productive and confident than non-experts; and ranked and rated pages differently against each type of disability. Finally, both the effectiveness and the reliability of the expert judges were significantly higher than those of the non-expert judges.