Abstract

The COVID-19 pandemic has significantly changed perceptions of education, with Massive Open Online Course (MOOC) providers such as Coursera registering millions of new users on their platforms. Yet despite the prevalence of online review systems in other industries, the MOOC ecosystem lacks a standardized or fully decentralized review system. We believe there is an opportunity to use existing open MOOC reviews to build user-friendly and transparent reviewing systems that enable learners to easily identify the best available courses and make informed enrolment decisions. In our research, we analyse reviews from the Coursera platform with the specific goal of determining whether NLP-driven sentiment analysis of textual reviews can provide valuable information to learners assessing the quality of MOOCs. Our results suggest that textual reviews may be more advantageous than numeric ratings, which suffer from drawbacks such as random or arbitrary selections. Sentiment analysis of textual reviews can yield useful signals of course quality, and the rich, descriptive information conveyed in such reviews may better equip learners to make informed decisions when selecting courses on platforms like Coursera.
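
As a minimal illustration of the kind of NLP-driven sentiment analysis described above, the sketch below scores review texts with a lexicon-based analyser (NLTK's VADER). The paper does not prescribe a specific sentiment model, and the sample reviews here are hypothetical; the sketch is intended only to show how a compound sentiment score could be derived from textual MOOC reviews.

```python
# Illustrative sketch: lexicon-based sentiment scoring of MOOC review texts.
# Assumptions: NLTK is installed; the review strings below are invented examples.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time download of the VADER lexicon

reviews = [
    "Excellent course, the assignments really reinforced the lectures.",
    "The content felt outdated and the quizzes were confusing.",
]

analyzer = SentimentIntensityAnalyzer()
for text in reviews:
    # The compound score ranges from -1 (most negative) to +1 (most positive)
    # and could be aggregated per course to complement or replace star ratings.
    scores = analyzer.polarity_scores(text)
    print(f"{scores['compound']:+.2f}  {text}")
```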
