Abstract

The system of least prompts, a response prompting procedure, has a rich history in special education research and practice. Recently, two independent systematic reviews were conducted to determine whether the system of least prompts meets the criteria to be classified as an evidence-based practice. Both reviews used the single-case design standards developed by the What Works Clearinghouse to evaluate the rigor and effects of studies; however, their findings and implications differed markedly. We examined the data supporting each review and discuss how two reviews of the same topic, using the same standards for evaluating studies, could arrive at different conclusions. Results indicate that differences in search parameters, visual analysis protocols, and the flexibility allowed by the design standards may have contributed to the discrepancies. We discuss the importance of multiple literature reviews on the same topic with regard to replication research in special education, and we highlight the necessity of open data in such reviews. Finally, we recommend how practitioners and researchers should collectively interpret the differing findings and conclusions of the reviews examining the system of least prompts.
