How Many Studies Do You Need?
- 1 April 2010
- Research article (journal)
- Published by American Educational Research Association (AERA) in Journal of Educational and Behavioral Statistics
- Vol. 35 (2), 215-247
- https://doi.org/10.3102/1076998609346961
Abstract
In this article, the authors outline methods for using fixed and random effects power analysis in the context of meta-analysis. Like statistical power analysis for primary studies, power analysis for meta-analysis can be done either prospectively or retrospectively and requires assumptions about parameters that are unknown. The authors provide some suggestions for thinking about these parameters, in particular for the random effects variance component. The authors also show how the typically uninformative retrospective power analysis can be made more informative. The authors then discuss the value of confidence intervals, show how they could be used in addition to or instead of retrospective power analysis, and also demonstrate that confidence intervals can convey information more effectively in some situations than power analyses alone. Finally, the authors take up the question “How many studies do you need to do a meta-analysis?” and show that, given the need for a conclusion, the answer is “two studies,” because all other synthesis techniques are less transparent and/or are less likely to be valid. For systematic reviewers who choose not to conduct a quantitative synthesis, the authors provide suggestions for both highlighting the current limitations in the research base and for displaying the characteristics and results of studies that were found to meet inclusion criteria.
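To make the prospective fixed- and random-effects power calculations concrete, here is a minimal sketch of the standard z-test approximation for the combined effect. It assumes a common within-study variance `v` across the `k` studies; the function name, the parameter values in the example, and that common-variance simplification are illustrative assumptions, not taken from the paper.

```python
from statistics import NormalDist

def meta_power(delta, v, k, tau2=0.0, alpha=0.05):
    """Approximate power of a two-sided z-test on the mean effect in a
    meta-analysis of k studies, each with (assumed common) within-study
    variance v and between-studies variance tau2.
    tau2=0 gives the fixed-effect case; tau2 > 0 the random-effects case.
    """
    nd = NormalDist()
    se = ((v + tau2) / k) ** 0.5          # SE of the weighted mean effect
    z_crit = nd.inv_cdf(1 - alpha / 2)    # two-sided critical value
    lam = delta / se                      # noncentrality parameter
    # lower-tail rejection region is usually negligible but kept for correctness
    return nd.cdf(lam - z_crit) + nd.cdf(-lam - z_crit)

# Prospective example (hypothetical numbers): 5 studies, true effect 0.3,
# per-study effect-size variance 0.05.
print(round(meta_power(0.3, 0.05, 5), 3))             # fixed effect
print(round(meta_power(0.3, 0.05, 5, tau2=0.05), 3))  # random effects
```

Note how a nonzero variance component `tau2` inflates the standard error of the mean effect and so reduces power for the same `k`, which is why the paper emphasizes making an informed assumption about it before the synthesis.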