Error rates in clinical radiotherapy

Abstract
PURPOSE: Error rates in clinical oncology are coming under increasing scrutiny. The purpose of this study was to characterize error frequency, error patterns, underlying causes, consequences, and possible prevention strategies in clinical radiotherapy.

PATIENTS AND METHODS: Treatment information, self-reported error documentation, and retrospective analyses of electronic treatment verification transcripts were reviewed and analyzed for 1,925 consecutive patients treated with a total of 93,332 individual radiotherapy fields.

RESULTS: A total of 59 separate errors affecting 168 individual treatment fields were detected, which yielded a crude radiation delivery error rate of 0.18%. All 59 errors were judged to be level I (negligible chance of adverse medical outcome); the most common error category was minor misplacement of a treatment field block. A comprehensive quality assurance program and an electronic record-and-verify linear accelerator interlock system appear to have prevented many additional errors. However, nine of the 59 errors were directly related to the use of this system and generally involved transposition of similar numbers within series of treatment coordinate data sets. Overall, radiotherapy error rates compare favorably with reported error rates for pharmaceutical administration in large tertiary care hospitals.

CONCLUSION: When modern automated error-minimization methods are used along with nonpunitive error-reporting systems, clinical radiotherapy appears to be highly safe. Formal error analysis studies may allow the rational design of prevention strategies attuned to the frequency, seriousness, and antecedent causes of many classes of potential radiotherapy errors.
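A minimal worked check of the reported crude rate, assuming (as the figures above imply) that it is computed as the number of error-affected treatment fields divided by the total number of fields delivered:

\[
\text{crude error rate} \;=\; \frac{168\ \text{affected fields}}{93{,}332\ \text{total fields}} \;\approx\; 0.0018 \;=\; 0.18\%
\]

Note that this is a per-field rate; the 59 distinct error events each affected one or more of the 168 fields.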