Development of Neurological Emergency Simulations for Assessment: Content Evidence and Response Process
- 21 January 2021
- Research article
- Published by Springer Science and Business Media LLC in Neurocritical Care
- Vol. 35 (2), 389-396
- https://doi.org/10.1007/s12028-020-01176-y
Abstract
Objective: To document two sources of validity evidence for simulation-based assessment in neurological emergencies.

Background: A critical aspect of education is the development of evaluation techniques that assess learners' performance in settings that reflect actual clinical practice. Simulation-based evaluation affords the opportunity to standardize evaluations but requires validation.

Methods: We identified topics from the Neurocritical Care Society's Emergency Neurological Life Support (ENLS) training, cross-referenced with the American Academy of Neurology's core clerkship curriculum. We used a modified Delphi method to develop simulations for assessment in neurocritical care. We constructed checklists of action items and communication skills, merging ENLS checklists with relevant clinical guidelines. We also used global rating scales, rated from one (novice) to five (expert), for each case. Participants included neurology sub-interns, neurology residents, neurosurgery interns, non-neurology critical care fellows, neurocritical care fellows, and neurology attending physicians.

Results: Ten evaluative simulation cases were developed. To date, 64 participants have taken part in 274 evaluative simulation scenarios. Participants were very satisfied with the cases (Likert scale 1–7, not at all satisfied to very satisfied; median 7, interquartile range (IQR) 7–7), found them very realistic (Likert scale 1–7, not at all realistic to very realistic; median 6, IQR 6–7), and appropriately difficult (Likert scale 1–7, much too easy to much too difficult; median 4, IQR 4–5). Interrater reliability was acceptable for both checklist action items (kappa = 0.64) and global rating scales (Pearson correlation r = 0.70).

Conclusions: We demonstrated two sources of validity evidence in ten simulation cases for assessment in neurological emergencies.
Funding Information
- American Board of Psychiatry and Neurology (Faculty Innovation in Education Award)