Abstract

Teachers’ knowledge of and skills in data-based instruction (DBI) can influence their self-efficacy and the fidelity with which they implement DBI, ultimately playing a crucial role in improving student outcomes. The purpose of this brief report is to provide evidence for the technical adequacy of a measure of DBI knowledge and skills in writing by examining its internal consistency reliability, comparing different factor structures, and evaluating item statistics using classical test theory and item response theory. We used responses from 154 elementary school teachers, primarily special educators, who worked with children with intensive early writing needs. Results from confirmatory factor analysis did not strongly favor either a one-factor solution, representing a single dimension of DBI knowledge and skills, or a two-factor solution comprising separate knowledge and skills subscales. Internal consistency reliability coefficients were within an acceptable range, particularly under the one-factor solution. Item difficulty and discrimination estimates varied across items, suggesting that certain items warrant further investigation. We discuss the potential of the DBI Knowledge and Skills Assessment, specifically in the context of measuring teacher-level DBI outcomes in writing.
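
To illustrate the classical test theory item statistics and internal consistency reliability referenced above, the following is a minimal sketch, not the study's actual data or analysis code. It treats item difficulty as the proportion of correct responses, discrimination as the corrected item-total correlation, and reliability as Cronbach's alpha; the simulated 0/1 response matrix and the item count of 20 are assumptions made purely for demonstration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def ctt_item_stats(items: np.ndarray):
    """Classical test theory statistics for dichotomous (0/1) items.

    Difficulty      = proportion of respondents answering correctly.
    Discrimination  = corrected item-total correlation (item vs. rest score).
    """
    difficulty = items.mean(axis=0)
    total = items.sum(axis=1)
    discrimination = np.empty(items.shape[1])
    for j in range(items.shape[1]):
        rest = total - items[:, j]  # exclude the item itself from the total
        discrimination[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical example: 154 respondents (matching the sample size reported
# above) and an assumed 20 dichotomous items, simulated from a single
# underlying ability so the statistics are interpretable.
rng = np.random.default_rng(0)
ability = rng.normal(size=(154, 1))
responses = (rng.normal(size=(154, 20)) < ability).astype(int)

print(f"alpha = {cronbach_alpha(responses):.2f}")
p, r = ctt_item_stats(responses)
print("difficulty range:", p.min().round(2), "to", p.max().round(2))
print("discrimination range:", r.min().round(2), "to", r.max().round(2))
```

In practice, items with very high or very low difficulty, or with low corrected item-total correlations, would be the ones flagged for further investigation, consistent with the variation in item difficulty and discrimination noted above.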
