Abstract

In this paper, we focus on 23 undergraduate students’ application of a universal design for learning (UDL) evaluation framework for assessing a massive open online course (MOOC) in the context of a university course on usability and accessibility. Using a mixed-methods approach, we first report the extent to which untrained raters agree when evaluating the course with the framework and then examine their feedback on using UDL for assessment purposes. Our results indicate that user feedback provides great value for the future development of accessible MOOCs and identifies opportunities to improve the evaluation framework. To that end, we suggest an iterative process of refining the framework while working with students, which could help students internalise UDL principles and guidelines and become expert learners and evaluators. The complexities and redundancies that surfaced in our research, as reported in this paper, illustrate that there is variability in both the perception of the course design and the interpretation of the framework. Results indicate that UDL cannot be applied as a list of simple checkpoints, but they also provide insights into aspects of the framework that can be improved to make the framework itself more accessible to students.
