Abstract

Knowledge-sharing in forums is an integral part of many MOOCs (Massive Open Online Courses), yet forum usage for knowledge-sharing in MOOCs is often inadequate. This study adopts a mixed-methods approach to investigate the problems behind MOOC learners' inadequate forum participation and proposes real-time sharing-quality-monitoring mechanisms to mitigate them. We explore different designs and implementations of computerised nudges to enhance knowledge contribution, considering challenges such as vast amounts of data, user aversion to AI monitoring, and complex user interactions. Testing graphical (Model A), numerical (Model B), and textual message (Model C) interface designs, we found that the graphical and numerical designs were the most effective in improving performance. Model C, by contrast, received conflicting judgments: some users felt controlled by the AI, while others found the algorithmic guidance valuable. Our findings shed light on leveraging computerised nudges for meaningful contributions and address concerns related to AI monitoring. The complex nature of user interactions and behaviours, together with the abundance of data, presents significant challenges that require innovative approaches. This study contributes to understanding the issues in MOOC forum participation and provides insights into effective computerised nudges. We discuss directions for refining the current design, emphasising the need for more design science research in this domain.
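The abstract does not specify how the nudges are implemented; as a rough, purely hypothetical sketch, a real-time sharing-quality monitor might map a contribution-quality score onto one of the three interface styles named above (graphical, numerical, or textual). All names, thresholds, and message wording below are illustrative assumptions, not the authors' design.

```python
from dataclasses import dataclass

# Hypothetical sketch only: names, thresholds, and wording are assumptions,
# not taken from the study itself.

@dataclass
class Nudge:
    style: str    # "graphical" (Model A), "numerical" (Model B), "textual" (Model C)
    payload: str  # what the forum UI would render


def build_nudge(quality_score: float, style: str) -> Nudge:
    """Turn a real-time sharing-quality score in [0, 1] into a nudge payload."""
    if style == "graphical":        # Model A: render a simple progress gauge
        filled = round(quality_score * 10)
        payload = "[" + "#" * filled + "-" * (10 - filled) + "]"
    elif style == "numerical":      # Model B: show the score directly
        payload = f"Contribution quality: {quality_score:.0%}"
    else:                           # Model C: textual message from the system
        payload = ("Your post looks helpful to other learners!"
                   if quality_score >= 0.6
                   else "Consider adding an example or evidence to your post.")
    return Nudge(style=style, payload=payload)


if __name__ == "__main__":
    for style in ("graphical", "numerical", "textual"):
        print(build_nudge(0.72, style))
```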
