Open educational resources (OER) have been praised for revolutionizing education. However, practitioners and instructors struggle to keep OER up to date and to measure their impact on student performance. Few studies have analyzed the improvement of OER over time in relation to achievement. This longitudinal study uses learning analytics, through the open-source Resource Inspection, Selection, and Enhancement (RISE) analysis framework, to assess the impact of continuous improvement cycles on student outcomes. Panel data (i.e., performance and use) from 190 learning objectives in the OER of an introductory sociology course were analyzed using a hierarchical linear model. Results show that more visits to an OER do not improve student achievement, but continuous improvement cycles of targeted OER do. Iterative application of the RISE analysis for resource improvement, combined with practitioners’ expertise, is key to student learning. Given that the RISE classification accounted for 65% of the growth in student performance, suggesting a moderate to large effect, we speculate that the RISE analysis could generalize to other contexts and yield greater student gains. Institutions and practitioners can increase the impact of OER by adopting learning analytics as a decision-making tool for instructional designers. However, a user-friendly, “click-and-go” implementation of learning analytics is necessary to generalize and scale continuous improvement cycles of OER and to achieve tangible improvements in learning outcomes. Finally, in this article, we identify the need for efficient applications of learning analytics that focus more on “learning” and less on analytics.