Abstract

More and more systematic reviews (SRs) are being published in the educational sciences. This umbrella mapping review examines 576 SRs published between 2018 and 2022 in the field of open, distance, and digital education (ODDE) to investigate publication and authorship patterns and to evaluate the quality of these SRs. A quality index score was calculated for each included study based on the PRISMA reporting items for SRs (including elements such as the search strategy, eligibility criteria, protocol registration, study quality appraisal, and interrater reliability). Almost as many SRs were published in 2022 as in the previous four years combined, and the most rigorous SRs come from the field of medical education. However, the results show that there is room for improvement in SRs published in ODDE. A content analysis exploring the thematic scope of the SRs showed that the majority addressed topics related to learning design, AI in education, and the effectiveness of online learning and teaching interventions. Research during this period was strongly influenced by experiences with online learning during the COVID-19 pandemic. The results of this umbrella review should help to improve the quality of SRs towards reproducible reviews in ODDE.

Context and implications

Rationale for this study: The field of open, distance, and digital education (ODDE) is in transition, and an umbrella mapping review is warranted given the dynamic growth of SRs in this literature.

Why the new findings matter: The quality of many published SRs limits the reproducibility and validity of the presented research evidence; the results of the present quality appraisal call for more rigorous SRs in ODDE.

Implications for researchers: Conducting SRs is a fruitful exercise for individual researchers and research institutions to gain a solid overview of a given topic. However, researchers must be trained in the appropriate methodology for the results to be reproducible and valid.

Implications for practitioners and policy-makers: Systematic reviews can be a valuable source to inform practice and policy-making. However, attention must be paid to the quality of the reviews; if the method is not carried out accurately, the results must be interpreted with caution.

Journal editors: As gatekeepers responsible for ensuring journal quality, editors should invite experts in SR methodology to join the editorial team and handle the peer-review process of SR submissions. It must be ensured that only methodologically sound SRs are published.