Abstract

Oral narrative assessments are important for diagnosing language disorders in school-age children, so scoring needs to be reliable and consistent. This study explored the impact of training on the variability of story grammar scores in children’s oral narrative assessments scored by multiple raters. Fifty-one speech pathologists and 19 final-year speech pathology students attended training workshops on oral narrative assessment scoring and analysis. Participants scored two oral narratives, prompted by two different story stimuli and produced by two children of differing ages. Demographic information, story grammar scores and a confidence survey were collected pre- and post-training. The total story grammar score changed significantly for one of the two oral narratives. A significant effect was observed between rater years of experience and the change in total story grammar scores post-training, with undergraduate students showing the greatest change. Two story grammar elements, character and attempt, changed significantly for both stories, with an overall trend of increased element scores post-training. Confidence ratings also increased post-training. Findings indicated that training via an interactive workshop can reduce rater variability when using researcher-developed narrative scoring systems.
