Development and Evaluation of a Pedagogical Tool to Improve Understanding of a Quality Checklist: A Randomised Controlled Trial
Open Access
- 4 May 2007
- journal article
- research article
- Published by Public Library of Science (PLoS) in PLoS Clinical Trials
- Vol. 2 (5), e22
- https://doi.org/10.1371/journal.pctr.0020022
Abstract
The aim of this study was to develop and evaluate a pedagogical tool to enhance understanding of a checklist for evaluating reports of nonpharmacological trials (CLEAR NPT). The design was a paired randomised controlled trial; participants were clinicians and systematic reviewers. We developed an Internet-based computer learning system (ICLS), a pedagogical tool that uses many examples from published randomised controlled trials to demonstrate the main coding difficulties encountered when using this checklist. Randomised participants received either specific Web-based training with the ICLS (intervention group) or no specific training (control group). The primary outcome was the rate of correct answers, compared to a criterion standard, for coding a report of a randomised controlled trial with the CLEAR NPT. Between April and June 2006, 78 participants were randomly assigned to receive training with the ICLS (39) or no training (39). Participants trained by the ICLS did not differ from the control group in performance on the CLEAR NPT: the mean paired difference was 0.5 (95% confidence interval −5.1 to 6.1), and the rate of correct answers did not differ between the two groups for any CLEAR NPT item. Combining both groups, the rate of correct answers was high for items related to allocation sequence (79.5%), description of the intervention (82.0%), blinding of patients (79.5%), and follow-up schedule (83.3%). The rate of correct answers was low for items related to allocation concealment (46.1%), co-interventions (30.3%), blinding of outcome assessors (53.8%), specific measures to avoid ascertainment bias (28.6%), and intention-to-treat analysis (60.2%). Although we showed no difference in effect between the intervention and control groups, our results highlight the gap in knowledge and the urgency of education on important aspects of trial conduct.

Trial registration: Controlled-Trials.com ISRCTN07698599
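The headline statistic above, a mean paired difference with a 95% confidence interval, can be illustrated with a short sketch. This is not the authors' analysis code; it is a minimal normal-approximation calculation on hypothetical paired scores (the variable names and data are invented for illustration):

```python
import math

def paired_diff_ci(intervention, control, z=1.96):
    """Mean of paired differences with a normal-approximation 95% CI.

    Each index i pairs one intervention participant's score with the
    matched control participant's score, as in a paired trial design.
    """
    diffs = [a - b for a, b in zip(intervention, control)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    se = math.sqrt(var / n)  # standard error of the mean difference
    return mean, mean - z * se, mean + z * se

# Hypothetical paired scores (percent correct) for four matched pairs.
mean, lo, hi = paired_diff_ci([10, 12, 9, 11], [9, 11, 10, 10])
print(f"{mean:.1f} ({lo:.2f} to {hi:.2f})")
```

A CI that spans zero, as in the trial's −5.1 to 6.1, indicates the data are compatible with no difference between the paired groups.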