Open Access
Reliability and validity of scenario-specific versus generic simulation assessment rubrics

Marian Luctkar-Flude, Deborah Tregunno, Kim Sears, Cheryl Pulling, Kayla Lee, Rylan Egan
Journal of Nursing Education and Practice, Volume 10; doi:10.5430/jnep.v10n8p74

Abstract: Background: This study assessed the reliability and validity of scenario-specific and generic simulation assessment rubrics used in two different deteriorating-patient simulations, and explored learner and instructor preferences. Methods: Learner performance was rated independently by three instructors using the two rubrics. Results: A convenience sample of 29 nursing students was recruited. Inter-rater reliability was similar but slightly higher for the generic rubric than for the scenario-specific learning outcomes assessment rubric (ICC = .759 vs. .748 and IRR = .693 vs. .641) across the two scenarios. Most students found the scenario-specific rubric more helpful to their learning (59%) and easier to use (52%). All three instructors found the scenario-specific rubric more helpful for guiding debriefing. Conclusions: Scenario-specific rubrics may be more valuable to learners, helping them identify their own knowledge and performance gaps and prepare for simulation. Additionally, scenario-specific rubrics provide direction for both learners and instructors during debriefing sessions.
Keywords: simulation; validity; instructors; learners; scenario-specific rubrics
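The abstract summarizes inter-rater reliability with an intraclass correlation coefficient (ICC). As a minimal illustrative sketch (the paper does not state which ICC variant was computed, so this assumes a two-way random-effects, single-rater ICC(2,1), and the learner scores below are hypothetical, not study data):

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random-effects, single-rater agreement,
    for an n_subjects x n_raters matrix of scores."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA decomposition of the total sum of squares
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Single-rater, two-way random-effects ICC
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical rubric scores: five learners rated by three instructors
scores = np.array([
    [78, 80, 79],
    [65, 63, 66],
    [90, 88, 91],
    [72, 75, 74],
    [85, 84, 86],
], dtype=float)

print(round(icc2_1(scores), 3))  # → 0.981
```

An ICC near 1 indicates that instructors rank and score learners consistently; values in the .7 range, as reported for both rubrics in the study, are commonly read as moderate-to-good reliability.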
