Abstract

Background: Implementation fidelity refers to the degree to which an intervention or programme adheres to its original design. This paper examines implementation fidelity in the Sound Start Study, a clustered randomised controlled trial of computer-assisted support for children with speech sound disorders (SSD).

Method: Sixty-three children with SSD in 19 early childhood centres received computer-assisted support (Phoneme Factory Sound Sorter [PFSS] – Australian version). Educators facilitated the delivery of PFSS targeting phonological error patterns identified by a speech-language pathologist. Implementation data were gathered via (1) the computer software, which recorded when and how much intervention was completed over 9 weeks; (2) educators’ records of practice sessions; and (3) scoring of fidelity (intervention procedure, competence and quality of delivery) from videos of intervention sessions.

Result: Less than one-third of children received the prescribed number of days of intervention, while approximately one-half participated in the prescribed number of intervention plays. Computer data differed from educators’ data for total number of days and plays in which children participated; the degree of match was lower as data became more specific. Fidelity to intervention procedures, competency and quality of delivery was high.

Conclusion: Implementation fidelity may impact intervention outcomes and so needs to be measured in intervention research; however, the way in which it is measured may impact on data.

Highlights

  • Evidence-based practice (EBP) is core to the provision of health services throughout the world, and speech-language pathology is no exception (e.g. American Speech-Language-Hearing Association, 2005; Speech Pathology Australia, 2011)

  • The purpose of this paper is to report on the implementation fidelity of the computer-assisted intervention program, Phoneme Factory Sound Sorter (PFSS – Australian version; Wren & Roulstone, 2013), delivered to children with speech sound disorders (SSD) in the Sound Start Study, a clustered randomised controlled trial

  • Adherence to the intervention protocol was examined via comparison of three data sets: (1) the computer software, which provided evidence of the number of days and games played by each child each week for the entire period of intervention; (2) educators’ records of the number of intervention days, sessions, and games played each week on a hard-copy recording sheet; and (3) speech-language pathologists’ (SLPs’) fidelity scoring from videos of the intervention sessions

Introduction

Evidence-based practice (EBP) is core to the provision of health services throughout the world, and speech-language pathology is no exception (e.g. American Speech-Language-Hearing Association, 2005; Speech Pathology Australia, 2011). Empirical research might exist to support the use of a particular intervention approach, but clinicians need to adapt the approach in the clinical setting due to issues including time, resourcing, or client characteristics (Roulstone, Wren, Bakapoulou, & Lindsay, 2012). Individualisation is recognised as an important component of intervention (Roth & Worthington, 2015). For this reason, Dollaghan (2007) suggested that engaging in EBP requires clinicians to consider and integrate multiple forms of evidence in clinical practice: empirical research, clinical experience, and information from clients. According to this reasoning, selecting and implementing a particular intervention simply because empirical research exists to support the approach would not equate to evidence-based practice unless it was implemented as directed.
