Abstract

Formative feedback from students can help college instructors improve their online teaching practices, especially instructors who are new to online teaching. Prior research indicates that mid-semester formative evaluations of college teaching are a promising, low-cost solution for providing online instructors with in-the-moment feedback. However, existing instruments suffer from issues of validity and bias and fail to align with evidence-based strategies. In this paper, we present psychometric results from a pilot study of our research-based Mid-Semester Evaluation of College Teaching (MSECT), designed to assist online educators in gathering student input to improve their online teaching and classroom climate.

Highlights

  • Mid-semester formative evaluations of college teaching are a promising, low-cost solution to providing online instructors with in-the-moment feedback to improve their online teaching practices

  • The purpose of this paper is to present the design and pilot validation of the Mid-Semester Evaluation for College Teaching (MSECT) for Online Instructors

  • Confirmatory factor analyses (CFA) produced evidence that the reduced MSECT-O instrument is a valuable tool because it provides online instructors with formative feedback specific to the four factors of teaching effectiveness outlined in the Fearless Teaching Framework
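For readers unfamiliar with the analysis named in the last highlight, the snippet below is a minimal sketch of how a four-factor CFA can be specified and fit with the Python package semopy. The software choice, the factor labels (Climate, Content, Practices, Assessment), and the item names (item1 ... item12) are assumptions for illustration only; they are not the actual MSECT-O items, factors, or the authors' analysis code.

```python
# Minimal sketch of a four-factor confirmatory factor analysis (CFA) in semopy.
# Factor labels and item names are placeholders, not the actual MSECT-O items.
import pandas as pd
import semopy

# lavaan-style syntax: each latent factor is measured by three placeholder items.
MODEL_DESC = """
Climate    =~ item1 + item2 + item3
Content    =~ item4 + item5 + item6
Practices  =~ item7 + item8 + item9
Assessment =~ item10 + item11 + item12
"""

def fit_cfa(responses: pd.DataFrame) -> None:
    """Fit the four-factor CFA to item responses (one row per student,
    one column per item) and print estimates and fit indices."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)
    print(model.inspect())           # factor loadings and variances
    print(semopy.calc_stats(model))  # fit indices such as CFI, TLI, RMSEA
```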

Introduction

Mid-semester formative evaluations of college teaching are a promising, low-cost way to provide online instructors with in-the-moment feedback to improve their online teaching practices. Faculty development programs (e.g., coaching, workshops) are commonly adopted to train instructors who are new to online teaching (Meyer & Murrell, 2014). These programs, however, require time and resources (Meyer, 2014) that many universities cannot supply at the pace at which online learning is growing. A low-cost approach for supporting online instructors is to gather mid-semester formative feedback from students, tailored to evidence-based teaching practices. Such mid-semester evaluation (MSE) feedback has the potential to provide just-in-time input to new online instructors, yet the field lacks a valid, evidence-based MSE instrument through which online students can reflect on their instruction. Critics of existing student evaluations of teaching (SET) argue that responses are biased by instructor identity, that the data are too general to inform faculty, and that items are not developed using evidence-based psychometrics (e.g., Hammonds, Mariano, Ammons, & Chambers, 2017).
