
Queen’s University Emergency Medicine Simulation OSCE: an Advance in Competency-Based Assessment

Published online by Cambridge University Press:  20 May 2015

Carly Margaret Hagel
Affiliation:
Department of Emergency Medicine, Queen’s University, Kingston, ON.
Andrew Koch Hall
Affiliation:
Department of Emergency Medicine, Queen’s University, Kingston, ON.
Jeffrey Damon Dagnone*
Affiliation:
Department of Emergency Medicine, Queen’s University, Kingston, ON.
Correspondence to: J. Damon Dagnone, Department of Emergency Medicine, Queen’s University, Kingston General Hospital, Empire 3, 76 Stuart Street, Kingston, Ontario, K7L2V7; Email: damondagnone@hotmail.com

Type: Brief Educational Report

Copyright © Canadian Association of Emergency Physicians 2015

Introduction

The use of high-fidelity simulation in emergency medicine (EM) training programs is increasing at a rapid pace.[1] There is abundant evidence supporting the use of simulation-based training for medical education[2] and a growing body of literature demonstrating that simulation is a valid tool for competency assessment.[3] Simulation has the potential to satisfy the need for valid competency-based assessment tools[4] as the assessment of clinical expertise moves away from traditional knowledge-based examinations and towards competency-based methods.[5] We describe the development of a simulation-based Objective Structured Clinical Examination (OSCE) for resuscitation and procedural skill competency assessment in the Queen’s University EM post-graduate training program, and our experience with it.

Description of Innovation

Rationale

There has been a broad call for the development of tools to assess the competency of post-graduate medical trainees in the United States,[6] Canada,[5] and internationally.[7,8] As the Royal College of Physicians and Surgeons (RCPS) moves toward the implementation of competency-based medical education curricula for postgraduate training,[9] the need for tools that accurately assess clinical competencies has become paramount. Traditional knowledge-based examinations (written tests and oral exams) assess trainees at a “knows how” level.[10] Direct evaluation of performance through simulation-based assessment provides an opportunity for simultaneous evaluation of knowledge, clinical reasoning, communication, and teamwork.[11] We launched a simulation-based OSCE in 2008 to answer the call for the development and implementation of competency assessments at the “shows how” level of Miller’s pyramid.[12]

Trainees and faculty

The Queen’s University post-graduate EM program oversees 30 resident trainees per year: 20 Royal College of Physicians and Surgeons of Canada EM candidates (RCPS-EM) and 10 College of Family Physicians of Canada EM candidates (CFPC-EM). All RCPS-EM and CFPC-EM residents are required to participate twice yearly in a standardized OSCE held in our clinical simulation centre (CSC). Two faculty members with training in medical education are primarily responsible for delivering the simulation-based OSCE assessments, while the RCPS-EM and CFPC-EM program directors assist with debriefing trainees following their performances.

Assessment framework

All residents are presented with two or three resuscitation scenarios in sequential order and are debriefed for up to 30 minutes by a faculty member immediately following their performance. The debriefing focuses on each individual trainee’s strengths, weaknesses, and areas for improvement.

Faculty debriefers use a standardized Queen’s Simulation Assessment Tool (QSAT) for the resuscitation scenarios (Figure 1). This tool has evolved since the inception of the assessment program and has been validated both locally and in a multicentre setting.[13,14] The QSAT was designed as a hybrid scoring tool with four anchored domain scores (primary assessment, diagnostic actions, therapeutic actions, and communication) and an overall performance score (global assessment). Within each domain, key anchors are included and may be modified to guide expert evaluators in assigning scores. A modified Delphi process involving EM and critical care physicians was used to adapt the generic QSAT to each standardized scenario. The QSAT is easily modifiable for use across multiple resuscitation scenarios and remains an ongoing research focus within our program.[14,15]

Figure 1 Sample of the Queen’s Simulation Assessment Tool (QSAT) used to assess resident performance in an OSCE station and guide feedback during debriefing.
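
To make the scoring structure concrete, the following minimal Python sketch is purely illustrative: the QSAT is a paper scoring sheet, and the field names and the assumed 1-5 anchored range here are not confirmed details of the published tool. It represents one scenario’s QSAT as four anchored domain scores plus a separately assigned global assessment, and sums the domains into a total.

from dataclasses import dataclass

# Domain names follow the article; the 1-5 anchored range is an assumption.
DOMAINS = ("primary_assessment", "diagnostic_actions",
           "therapeutic_actions", "communication")

@dataclass
class QSATScore:
    primary_assessment: int
    diagnostic_actions: int
    therapeutic_actions: int
    communication: int
    global_assessment: int  # assigned by the evaluator, not derived from the domains

    def __post_init__(self):
        # Reject scores outside the assumed anchored range.
        for name in DOMAINS + ("global_assessment",):
            value = getattr(self, name)
            if not 1 <= value <= 5:
                raise ValueError(f"{name} must be between 1 and 5, got {value}")

    @property
    def domain_total(self) -> int:
        # Total of the four anchored domain scores (4-20 under a 1-5 range).
        return sum(getattr(self, name) for name in DOMAINS)

# Example: one resident's scores for a single scenario.
score = QSATScore(primary_assessment=4, diagnostic_actions=3,
                  therapeutic_actions=4, communication=5, global_assessment=4)
print(score.domain_total, score.global_assessment)  # -> 16 4

In practice the domain, total, and global scores are recorded on the paper QSAT sheet and entered into the trainee’s portfolio; the sketch simply shows how the two kinds of scores relate.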

Scenarios

OSCE scenarios are carried out in a single room designed as a simulated resuscitation bay, using a high-fidelity mannequin. All scenarios are run by a simulation-trained faculty member and an experienced simulation laboratory technician, and involve roles for the mannequin operator and standardized actors portraying a nurse and a respiratory therapist. The scenarios are pre-scripted and delivered in a standardized fashion to all residents. All resident performances are video-recorded using Kb Port ETC Pro (Kb Port, Allison Park, PA), a three-camera system that also captures the cardiac monitor display and audio.

The scenarios were designed, implemented, and debriefed by EM faculty with training in simulation-based instruction. The two faculty members primarily responsible for this program have a combined 15 years of experience in simulation-based instruction and assessment, and both hold Master’s degrees in medical education. One of the faculty members has also obtained additional training in simulation at Harvard University (Boston, USA). The scenarios include cardinal presentations such as unstable arrhythmia, cardiac arrest, bradycardia, decreased level of consciousness, and hypotension. A blueprinting exercise was performed, mapping the scenarios against the 7th edition of Rosen’s Emergency Medicine textbook,[16] to ensure a proper distribution of resuscitation cases and the assessment of multiple core EM competencies. The standardized scenarios were designed to discriminate between residents at different levels of training by eliciting observable behaviours that traditionally are best assessed in the actual clinical environment. In a validation study of the QSAT that compiled data from sequential OSCE examinations, nine of 10 scenarios showed statistically significant differences in assessment scores between junior and senior RCPS-EM trainees, demonstrating discriminating ability.[14]
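
As a rough illustration of the blueprinting step described above, the short Python sketch below uses an entirely hypothetical scenario list and content-area labels (not the actual blueprint used in our program) to check that a proposed set of OSCE scenarios covers each targeted content area at least once.

from collections import Counter

# Hypothetical blueprint: each scenario is tagged with the textbook content
# area it is meant to sample. Names and areas are illustrative only.
scenario_blueprint = {
    "unstable_arrhythmia": "Cardiovascular",
    "cardiac_arrest": "Cardiovascular",
    "bradycardia": "Cardiovascular",
    "decreased_loc": "Neurologic",
    "hypotension": "Shock",
}

# Content areas the examination is intended to cover (also illustrative).
target_areas = {"Cardiovascular", "Neurologic", "Shock", "Respiratory"}

coverage = Counter(scenario_blueprint.values())
missing = target_areas - set(coverage)

print("Scenarios per content area:", dict(coverage))
print("Uncovered content areas:", missing or "none")

A check like this flags content areas with no assigned scenario, mirroring the intent of mapping scenarios against the textbook to ensure a proper case distribution.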

Debriefing and self-assessment

Faculty debriefing occurs in an adjacent private conference room after each resident has completed the sequential performance of all scenarios. Performance expectations are based on the resident’s post-graduate level. Trainees receive a score for each QSAT domain, a total score for all domains combined, and a global assessment score for each scenario performance. QSAT scoring sheets are placed in each trainee’s portfolio as a formative objective measure of their progression and are discussed during quarterly reviews with the program director.

All video-recorded performances are uploaded to a secure server and linked to each trainee on a Queen’s University online Moodle™ platform (Figure 2). Each resident can access only their own video recordings using a login and password. Residents are also provided with an online self-assessment scoring tool in PDF form, which can be downloaded and completed electronically. Access to the video recording of their own performance, together with the opportunity to meet with faculty at a later date, gives all residents a means of further formative feedback after delayed reflection and self-assessment.

Figure 2 A screen capture of the video recording made during the OSCE that residents access following the OSCE.

Resident perspective

By the end of their residency, Queen’s University RCPS-EM residents will have participated in 10 simulation-based OSCEs and will have logged over 150 hours in the simulation laboratory through the simulation-based resuscitation curriculum at Queen’s University.[17] Similarly, Queen’s University CFPC-EM residents will have participated in two simulation-based OSCEs and will have logged over 30 hours in the simulation laboratory. Simulation-based examinations are a valuable tool for faculty at our centre to assess individual clinical skill acquisition, and are equally important to trainees, who use them to gauge improvement and reflect on their progress and competency development.

Summary

The Queen’s University EM post-graduate simulation-based OSCE is an innovation at the forefront of competency-based medical education. Since its implementation, the use of high-fidelity simulation and medically diverse scenarios to assess the clinical skills and competence of all RCPS-EM and CFPC-EM residents at our centre has allowed both faculty educators and resident trainees to gauge clinical performance and improvement over time. The development of a modifiable assessment tool has been essential to the success of our simulation-based OSCE. Simulation-based examinations will be an asset as we train future generations of competent physicians.

Competing Interests: None to declare.

References

1. Okuda Y, Bond W, Bonfante G, et al. National growth in simulation training within emergency medicine residency programs, 2003-2008. Acad Emerg Med 2008;15(11):1113-1116.
2. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86(6):706-711.
3. Cook DA, Brydges R, Zendejas B, et al. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013;88(6):872-883.
4. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010;44(1):50-63.
5. The Future of Medical Education in Canada: A Collective Vision for Postgraduate Medical Education. Ottawa: Association of Faculties of Medicine of Canada; 2012.
6. Steadman RH, Huang YM. Simulation for quality assurance in training, credentialing and maintenance of certification. Best Pract Res Clin Anaesthesiol 2012;26(1):3-15.
7. Amin Z, Boulet JR, Cook DA, et al. Technology-enabled assessment of health professions education: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 2011;33(5):364-369.
8. Hamstra SJ. Keynote address: the focus on competencies and individual learner assessment as emerging themes in medical education research. Acad Emerg Med 2012;19(12):1336-1343.
9. Frank JR, Snell LS, Sherbino J, eds. Draft CanMEDS 2015 Physician Competency Framework - Series III. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2014. Available at: http://www.royalcollege.ca/portal/page/portal/rc/common/documents/canmeds/framework/canmeds2015_framework_series_III_e.pdf.
10. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 Suppl):S63-S67.
11. Epstein RM. Assessment in medical education. N Engl J Med 2007;356(4):387-396.
12. Boursicot K, Etheridge L, Setna Z, et al. Performance in assessment: consensus statement and recommendations from the Ottawa conference. Med Teach 2011;33(5):370-383.
13. Dagnone JD, Hall AK, Woolfrey K, et al. QSAT - validation of a competency-based resuscitation assessment tool - a Canadian multi-centred study. Ottawa: Canadian Association of Emergency Physicians National Conference; June 2014.
14. Hall AK, Dagnone JD, Lacroix LL, et al. Queen's Simulation Assessment Tool (QSAT): development and validation of an assessment tool for resuscitation OSCE stations in emergency medicine. Simul Healthc 2015;10(2):98-105.
15. Hall AK, Pickett W, Dagnone JD. Development and evaluation of a simulation-based resuscitation scenario assessment tool for emergency medicine residents. CJEM 2012;14(3):139-146.
16. Marx JA, Hockberger RS, Walls RM, et al. Rosen's Emergency Medicine: Concepts and Clinical Practice, 7th ed. Philadelphia: Mosby Elsevier; 2009.
17. Dagnone JD, McGraw R, Howes D, et al. How we developed a comprehensive resuscitation-based simulation curriculum in emergency medicine. Med Teach 2015; epub ahead of print, 1-6, doi:10.3109/0142159X.2014.976187.