Visual analysis is the primary methodology used to determine treatment effects from graphed single-case design data. Previous studies have reported mixed findings on interrater agreement among both expert and novice visual analysts, which represents a critical limitation of visual analysis and supports calls to also present statistical analyses (i.e., effect size measures). However, few single-case design studies report results of both visual and quantitative analyses for the same data set. The present study investigated whether blind review by experts in visual analysis of single-case graphs constructed according to up-to-date recommendations would demonstrate adequate interrater agreement and correspond with an effect size metric, the log response ratio. Eleven experts (i.e., professors in school psychology and special education with visual analysis experience) analyzed 26 multiple-baseline graphs evaluating the effects of implementation planning, a fidelity support, on educators' implementation and student outcomes; graphs were presented in a standardized format with no indication of the variable being measured. Results suggest strong correspondence among raters in their judgments of both the presence or absence of treatment effects and the meaningfulness of those effects, particularly for graphs of adherence and quality. Additionally, a quadratic relationship was observed between aggregated expert visual-analysis ratings and effect size statistics. Implications for future research and limitations are discussed.
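For readers unfamiliar with the effect size metric named above, the following is a minimal sketch assuming the standard definition of the log response ratio used in single-case research, LRR = ln(M_treatment / M_baseline), where M denotes the phase mean. The function name and data below are illustrative assumptions, not values or code from the study.

```python
import math

def log_response_ratio(baseline, treatment):
    """Return ln(M_treatment / M_baseline).

    Both phase means must be positive for the ratio to be defined,
    which is why the LRR suits outcomes measured on a ratio scale
    (e.g., percentage adherence, response rates).
    """
    m_a = sum(baseline) / len(baseline)
    m_b = sum(treatment) / len(treatment)
    return math.log(m_b / m_a)

# Hypothetical percent-adherence observations across sessions for one case.
baseline = [40, 35, 45, 38]
treatment = [70, 75, 80, 78]
print(round(log_response_ratio(baseline, treatment), 3))
# 0.651 -> the treatment-phase mean is about 1.9x the baseline mean;
# an LRR of 0 would indicate no change from baseline.
```

A positive LRR indicates an increase relative to baseline, which makes aggregate LRR values directly comparable to visual analysts' judgments of whether a treatment effect is present and meaningful.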