Abstract
Meta-analysis collects and synthesizes results from individual studies to estimate an overall effect size. When the studies included are drawn from the published literature, say through a literature review, an inherent selection bias may arise: for example, studies may tend to be published more readily if they are statistically significant, or if they are deemed more “interesting” in terms of the impact of their outcomes. We develop a simple rank-based data augmentation technique, formalizing the use of funnel plots, to estimate and adjust for the number and outcomes of missing studies. Several nonparametric estimators of the number of missing studies are proposed, and their properties are developed analytically and through simulations. We apply the method to simulated and epidemiological datasets and show that it is both effective and consistent with other criteria in the literature.
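To illustrate the rank-based idea behind this kind of data augmentation, the sketch below computes one trim-and-fill-style estimate of the number of missing studies (a Wilcoxon-type “L0” form common in the funnel-plot literature). It is only a minimal sketch under stated assumptions: the function name, the use of the sample median as a stand-in for the pooled effect, and the choice of this particular estimator are illustrative and are not specified by the abstract itself.

```python
import numpy as np


def estimate_missing_studies_l0(effects, center=None):
    """Rank-based (Wilcoxon-type) estimate of the number of suppressed studies.

    Assumptions (not taken from the abstract): effects are study-level effect
    sizes on a common scale, and `center` approximates the pooled effect; the
    median is used here only as a crude stand-in.
    """
    effects = np.asarray(effects, dtype=float)
    n = len(effects)
    if center is None:
        center = np.median(effects)

    dev = effects - center
    # Rank studies by absolute deviation from the center (1 = smallest).
    ranks = np.argsort(np.argsort(np.abs(dev))) + 1
    # Sum of ranks for studies falling on the positive side of the center.
    t_n = ranks[dev > 0].sum()
    # "L0"-style estimator of the number of missing studies.
    l0 = (4 * t_n - n * (n + 1)) / (2 * n - 1)
    return max(0, int(round(l0)))


if __name__ == "__main__":
    # Toy example: a funnel that looks one-sidedly censored.
    observed = [0.1, 0.2, 0.25, 0.3, 0.35, 0.4, 0.5, 0.6, 0.8]
    print(estimate_missing_studies_l0(observed))
```

In the full procedure as usually described in the trim-and-fill literature, an estimate of this kind would be iterated: the most extreme studies are trimmed, the center is re-estimated, and the estimator is re-applied until it stabilizes, after which the funnel plot is “filled” with mirror images of the trimmed studies to adjust the pooled effect.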