Reliability of a Core Competency Checklist Assessment in the Emergency Department: The Standardized Direct Observation Assessment Tool

Abstract
A Council of Emergency Medicine Residency Directors task force developed the Standardized Direct Observation Assessment Tool (SDOT), a 26-item checklist for evaluating Accreditation Council for Graduate Medical Education resident core competencies by direct observation. Each checklist item is assigned to one or more of five core competencies. The objective of this study was to test the interrater measurement properties of the SDOT. Two videos of simulated patient-resident-attending physician encounters were produced. Academic emergency medicine faculty members not involved in developing the form viewed the two encounters and completed the SDOT for each; faculty demographic data were also collected. Data were obtained from 82 faculty members at 16 emergency medicine residency programs. The checklist items were used to generate a composite score for each of the five core competencies: patient care, medical knowledge, interpersonal and communication skills, professionalism, and systems-based practice. Univariate analysis demonstrated a high degree of agreement among evaluators for both videos. Multivariate analysis found no differences in faculty ratings by experience, academic title, site, or previous use of the SDOT. Faculty from 16 emergency medicine residency programs showed high interrater agreement when using the SDOT to evaluate resident core competency performance. This study did not test the validity of the tool; the data analysis is mainly descriptive, and scripted video scenarios may not fully approximate direct observation in the emergency department.
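
As a minimal illustrative sketch only (not the study's actual analysis), the snippet below shows how checklist ratings of this kind could be rolled up into per-competency composite scores and summarized with a simple pairwise exact-agreement statistic. The item-to-competency mapping, the 3-point rating scale, and the rater data are hypothetical; the abstract does not specify the scale or the agreement statistic used.

```python
# Hypothetical sketch of aggregating SDOT-style checklist ratings into
# competency composite scores and a simple interrater agreement summary.

from itertools import combinations
from statistics import mean

# Hypothetical mapping: checklist item -> competencies it contributes to.
# The real SDOT assigns each of its 26 items to one or more of five
# ACGME core competencies.
ITEM_COMPETENCIES = {
    "item_01": ["patient_care"],
    "item_02": ["patient_care", "medical_knowledge"],
    "item_03": ["interpersonal_communication"],
    "item_04": ["professionalism"],
    "item_05": ["systems_based_practice"],
}

# Hypothetical ratings from three faculty raters on an assumed 3-point scale
# (1 = needs improvement, 2 = meets expectations, 3 = above expectations).
RATINGS = {
    "rater_A": {"item_01": 3, "item_02": 2, "item_03": 3, "item_04": 3, "item_05": 2},
    "rater_B": {"item_01": 3, "item_02": 2, "item_03": 2, "item_04": 3, "item_05": 2},
    "rater_C": {"item_01": 3, "item_02": 3, "item_03": 3, "item_04": 3, "item_05": 2},
}


def composite_scores(ratings):
    """Average each rater's item ratings within every competency."""
    out = {}
    for rater, items in ratings.items():
        per_comp = {}
        for item, score in items.items():
            for comp in ITEM_COMPETENCIES[item]:
                per_comp.setdefault(comp, []).append(score)
        out[rater] = {comp: mean(scores) for comp, scores in per_comp.items()}
    return out


def pairwise_exact_agreement(ratings):
    """Fraction of items on which each pair of raters gave identical scores."""
    items = sorted(next(iter(ratings.values())))
    results = {}
    for a, b in combinations(sorted(ratings), 2):
        matches = sum(ratings[a][i] == ratings[b][i] for i in items)
        results[(a, b)] = matches / len(items)
    return results


if __name__ == "__main__":
    for rater, comps in composite_scores(RATINGS).items():
        print(rater, comps)
    for pair, agreement in pairwise_exact_agreement(RATINGS).items():
        print(pair, f"{agreement:.2f}")
```

In practice, a reliability study of this design would more likely report a chance-corrected or variance-based statistic (for example, a kappa coefficient or an intraclass correlation) rather than raw percent agreement; the pairwise exact-agreement function above is chosen only to keep the sketch self-contained.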