Student-created videos engage students’ interests, creativity, and content knowledge and enrich collaborative learning in STEM education. These videos enhance critical thinking and analytical skills, which are essential tools in the fields of science, technology, engineering, and mathematics (STEM). This study presents the results of such an assignment across several STEM areas (biology, chemistry, exercise science, information technology, and mathematics) at a minority-serving, liberal arts higher education institution in the southeast region of the United States. Undergraduate students (n = 557) across varied online course modalities (synchronous and asynchronous) were required to create four problem-solving videos 3–8 min in duration. Assessment tools included a self-assessment of learning gains survey given to control and experimental groups and a post-video creation survey given only to experimental groups. Grade data were also collected from all sections. Comparing the experimental and control groups, students showed a statistically significant gain in their ability to give oral presentations, create videos, and edit videos. Qualitative data from free-response questions corroborate these gains and suggest that students also grew in content knowledge and conceptual understanding through these assignments. Our study implements a multimedia theoretical framework, which suggests students learn more effectively from consuming presentations with both auditory and visual components. Our results suggest students see similar gains from producing presentations with both auditory and visual components. Further, our results suggest that multimedia production enhances students’ presentation skills. From a practical perspective, this study suggests that faculty should incorporate student-created videos in online classes as a substitute for the oral presentations typically required in person.
Faculty are also advised to require oral and visual components within these videos to maximize learning gains from the perspective of a multimedia theoretical framework.