Abstract

The use of eModeration (the online moderation of examination scripts) can improve the efficiency of the examination moderation process while lowering the risk of scripts being misplaced or the process being delayed. Despite the potential benefits of eModerate systems for optimising examination procedures, the implementation and adoption of such online moderation systems in the South African context remain limited. Various factors could contribute to resistance to the implementation and adoption of eModerate systems in higher education institutions, including human factors as well as technical and organisational resistance to change. This study focuses on the human factors involved in eModeration (user experience) and attempts to answer the following research question: How can the User Experience Evaluation Framework for eModeration be utilised within the context of higher education institutions in South Africa? The research followed a Design Science Research methodology, which included the design, development and testing of the User Experience Evaluation Framework for eModeration. This paper reports on issues identified with the User Experience Evaluation Framework for eModeration during the evaluation phase. The research was conducted at Midrand Graduate Institute (MGI), now trading as Pearson Institute of Higher Education, a private higher education institution in South Africa. Data were generated through interviews with eModerators from different faculties within this institution. The paper makes a theoretical contribution by identifying the problems that users might experience with the implementation of the User Experience Evaluation Framework for eModeration, as well as providing insights into the user experience of eModerators.
