A Commentary
- 6 December 2012
- research article
- Published by SAGE Publications in Remedial and Special Education
- Vol. 34(1), pp. 39–43
- https://doi.org/10.1177/0741932512468038
Abstract
The What Works Clearinghouse document on single-case experimental designs is critiqued, and three criticisms are offered. First, between-group experimental concepts and terminology have unfortunately seeped into the description of single-case experimental designs; examples and potential negative effects are identified. Second, the document contains a number of critical omissions, four of which are noted. Third, the document contains some "errors" or misconceptions; examples are provided. Despite these criticisms, the document is generally accurate and may become quite useful to individuals reviewing single-case studies when they conduct syntheses of the research literature.