Abstract

Traditional behavior analysis relies upon single-subject study designs and visual inspection of graphed data to evaluate the efficacy of experimental manipulations. Attempts to apply inferential statistical procedures to these data have been successfully opposed for many decades, despite problems with visual inspection and increasingly cogent arguments for utilizing inferential statistics. In a series of experiments, we show that trained behavior analysts often identify level shifts in responding during intervention phases (a ‘treatment effect’) in modestly autocorrelated data, but that trends are either misconstrued as level treatment effects or missed entirely. Errors in trend detection illustrate the liabilities of using visual inspection as the sole means of analyzing behavioral data. Meanwhile, greatly increased computing power and advanced mathematical techniques have made previously undeveloped or underutilized statistical methods far more sophisticated, and these methods have been brought to bear on a variety of problems associated with repeated measures data. I present several nonparametric procedures and other statistical techniques for evaluating traditional behavioral data, intended to augment, not replace, visual inspection.
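
As one concrete illustration (not drawn from the paper itself), the following is a minimal Python sketch of one such nonparametric procedure, a randomization test applied to a simulated A-B dataset with modest AR(1) autocorrelation. The phase lengths, autocorrelation coefficient, and size of the level shift are assumptions chosen for demonstration only.

import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, phi, shift=0.0):
    # Generate n points of an AR(1) process x[t] = phi * x[t-1] + noise,
    # then add an optional level shift to every observation.
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x + shift

# Hypothetical A-B design: 20 baseline points, then 20 intervention points
# with a level shift added to simulate a treatment effect.
baseline = ar1_series(20, phi=0.3)
treatment = ar1_series(20, phi=0.3, shift=1.5)
observed = treatment.mean() - baseline.mean()

# Randomization test: repeatedly shuffle the phase labels and recompute the
# between-phase mean difference to build a reference distribution.
combined = np.concatenate([baseline, treatment])
n_base = len(baseline)
perm_diffs = []
for _ in range(5000):
    perm = rng.permutation(combined)
    perm_diffs.append(perm[n_base:].mean() - perm[:n_base].mean())

p_value = np.mean(np.abs(perm_diffs) >= abs(observed))
print(f"observed level shift = {observed:.2f}, randomization p = {p_value:.4f}")

With autocorrelated observations, shuffling individual data points only approximately satisfies the exchangeability assumption behind a simple permutation test, which is one reason procedures tailored to repeated measures data are of interest here.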
