Abstract

This study investigated the characteristics of peer discussions used to support formative assessment in lectures, facilitated by a student response system, in an undergraduate qualitative methods course for psychology students. The research was guided by the following questions: (1) What patterns of talk can be identified in the discussions? (2) How do the students use subject-specific vocabulary in the discussions? (3) How is the students' understanding of the subject matter displayed in these discussions? To examine the characteristics of peer interactions, 87 student discussions were recorded and analysed, using the concept of exploratory talk as an analytical lens. In 68 of the 87 discussions, the students exchanged ideas and elaborated on their peers' ideas and understanding of the concepts. In the remaining 25 discussions, the process of reasoning was less visible. The findings are relevant for teaching designs that aim to use digital tools to facilitate formative assessment.

Practitioner Notes

What is already known about this topic
- Student response systems can support formative assessment and feedback in lectures.
- The most common approaches used in research on student response systems to support formative assessment are questionnaires or interviews.
- Few studies have provided detailed analyses of clicker-supported peer discussions.

What the paper adds
- Provides insights into the micro-processes in clicker-supported discussions.
- Critically discusses the role of discussions facilitated by student response systems in supporting formative feedback in the classroom.
- Contributes to scholarship in this field by drawing attention to the qualities of the activities created for formative assessment.
- Discusses the validity of the inferences drawn from the use of student response systems in the classroom.

Implications for policy and practice
- Analysis of the discussions shows that there is not necessarily a correlation between aggregated answers and students' understanding.
- To ensure that valid inferences can be drawn from activities as a basis for feedback, the questions used and threats to the validity of possible inferences should be critically examined.
