Do coder characteristics influence validity of ICD-10 hospital discharge data?
Open Access
- 21 April 2010
- research article
- Published by Springer Science and Business Media LLC in BMC Health Services Research
- Vol. 10 (1), 99
- https://doi.org/10.1186/1472-6963-10-99
Abstract
Background: Administrative data are widely used to study health systems and make important health policy decisions. Yet little is known about the influence of coder characteristics on administrative data validity in these studies. Our goal was to describe the relationship between several measures of validity in coded hospital discharge data and 1) coders' volume of coding (≥13,000 vs. <13,000 records), 2) coders' employment status (full- vs. part-time), and 3) hospital type.

Methods: This descriptive study examined 6 indicators of face validity in ICD-10 coded discharge records from 4 hospitals in Calgary, Canada between April 2002 and March 2007. Specifically, mean numbers of coded diagnoses, procedures, complications, Z-codes, and codes ending in 8 or 9 were compared by coding volume and employment status, as well as hospital type. The mean number of diagnoses was also compared across coder characteristics for 6 major conditions of varying complexity. Next, kappa statistics were computed to assess agreement between discharge data and linked chart data reabstracted by nursing chart reviewers. Kappas were compared across coder characteristics.

Results: 422,618 discharge records were coded by 59 coders during the study period. The mean number of diagnoses per record decreased from 5.2 in 2002/2003 to 3.9 in 2006/2007, while the number of records coded annually increased from 69,613 to 102,842. Coders at the tertiary hospital coded the most diagnoses (5.0, compared with 3.9 and 3.8 at the other sites). There was no variation by coder or site characteristics for any other face validity indicator. The mean number of diagnoses increased from 1.5 to 7.9 with increasing complexity of the major diagnosis, but did not vary with coder characteristics. Agreement (kappa) between coded data and chart review did not show any consistent pattern with respect to coder characteristics.
Conclusions: This large study suggests that coder characteristics do not influence the validity of hospital discharge data. Other jurisdictions might benefit from implementing employment practices similar to ours, e.g., a requirement for a 2-year college training program, a single management structure across sites, and rotation of coders between sites. A limitation is that few coder characteristics were available for study due to privacy concerns.
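The agreement measure used in the Methods is Cohen's kappa: observed agreement between two raters, corrected for the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch of that computation is below; the function name and the coder-vs.-reviewer data are hypothetical illustrations, not taken from the study.

```python
# Hypothetical sketch of Cohen's kappa, the chance-corrected agreement
# statistic used to compare coded discharge data with reabstracted
# chart data. All names and example data here are illustrative.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative example: coder vs. chart reviewer on whether a diagnosis
# code is present (1) or absent (0) in 10 records.
coder    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
reviewer = [1, 1, 0, 0, 0, 0, 1, 1, 1, 1]
print(round(cohens_kappa(coder, reviewer), 3))  # 0.8 observed agreement → kappa ≈ 0.583
```

Note that this sketch does not handle the degenerate case where both raters use a single label (expected agreement of 1), and the study additionally reports prevalence-adjusted variants of kappa, which correct for skewed label frequencies.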