
Simulation as a Teaching Method: Evaluation of the University of Minnesota Humanitarian Crisis Simulation

Published online by Cambridge University Press: 05 April 2022

Sarah Kesler*
Affiliation:
Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, University of Minnesota Medical School, Minneapolis, Minnesota, United States
Eric James
Affiliation:
Field Ready, Evanston, Illinois, United States
Amy Scheller
Affiliation:
University of Minnesota, Minneapolis, Minnesota, United States
Sherry Gray
Affiliation:
University of Minnesota, Minneapolis, Minnesota, United States
Len Kne
Affiliation:
U-Spatial, University of Minnesota, Minneapolis, Minnesota, United States
Brett Hendel-Paterson
Affiliation:
Global Medicine, University of Minnesota Medical School, Minneapolis, Minnesota, United States
*Corresponding author: Sarah Kesler, Email: kesle002@umn.edu.

Abstract

Objective:

The University of Minnesota Humanitarian Crisis Simulation provides transdisciplinary training in disaster response. The course directors wished to better understand the learning outcomes and experiences of simulation participants.

Methods:

The learning outcomes and experiences of participants in the 2019 simulation were assessed using 3 modalities: 1) pre- and post-simulation test, 2) participants’ self-assessment of learning, and 3) qualitative feedback via an anonymous evaluation.

Results:

Participant scores on the knowledge survey were significantly higher after the simulation than before the simulation (mean percent correct 71% vs. 48%, P < 0.0001). A significant majority of participants who completed the assessment believed they had made gains within each learning objective. Anonymous evaluations contained both positive feedback and constructive criticism, leading to plans for refinements in subsequent training events.

Conclusions:

The Humanitarian Crisis Simulation is an effective experiential training program that increases participants’ knowledge in the field of disaster response. Participants also believed they had made gains in each learning objective. The authors discuss the elements that have contributed to the success of the program, as well as areas for future program growth and improvement.

Type
Brief Report
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of Society for Disaster Medicine and Public Health, Inc.

Introduction

The need for well-trained aid workers is increasing each year. Between 2010 and 2020, the number of people who required humanitarian aid increased from 74 million to 168 million.1 Between 2020 and 2021, this increased by 40% to a total of 235 million.2 Aid work requires competency across a broad set of skills. Physicians and other health care providers may find themselves particularly unprepared to function effectively given the specificity of their training.3

At the University of Minnesota, 2 faculty members (SK and EJ) founded the Humanitarian Crisis Simulation training program in 2011. The mission of the program is to provide a realistic scenario that increases participants’ knowledge and skills in the field of humanitarian relief, allows participants to assess their career alignment with humanitarian response, and increases participants’ understanding of the experiences of displaced persons and aid workers. The course directors were aware of a high level of interest in disaster response among students and saw the need for training based on their personal experience.

Both course directors have worked in humanitarian disasters where inexperienced aid workers made mistakes with negative consequences. One course director had failed to promptly identify a cholera outbreak, while the other had worked on a project in which a female aid worker was abducted while jogging, in part owing to a lack of situational awareness. The course directors felt strongly that training in disaster response should include authentic learning and experiential methods so that students could comprehend and apply the transdisciplinary, demanding nature of the subject matter.

The initial event was a 24-hour simulation of a humanitarian disaster intended to impart basic knowledge and skills in the field of humanitarian relief. Participants attended a brief series of lectures covering basic concepts in humanitarianism, and then conducted a rapid assessment of a fictional region experiencing a humanitarian emergency. An equal number of medical trainees and graduate students attended. During informal debriefing sessions, participants expressed enthusiasm for the course and a desire for more comprehensive training. As a result, the course directors reviewed the existing literature on humanitarian and disaster response training programs,4,5 and enlisted a number of content experts to expand the curriculum. The curriculum now covers the Sphere and Core Humanitarian Standards,6,7 humanitarian and human rights law, rapid assessment, security, Geographic Information Systems (GIS), leadership, disaster medicine, and psychological first aid. The course aims to achieve its mission through 8 learning objectives (Supplemental Table 1: Simulation Learning Objectives).

The program delivers content through pre-simulation online material and a full-scale simulation of a humanitarian crisis in a large outdoor setting.8 Participants in the program are adult learners from a variety of backgrounds, including medicine, public health, public policy, security technology, and social work. The program is offered as a 1-credit graduate school class and as a non-credit learning experience for medical trainees and external participants of all professional backgrounds. In 2019, the simulation had 49 participants, 40% of whom were health care workers. The simulation begins on a Friday afternoon and ends on a Sunday afternoon (simulation activities are suspended between the hours of 10 PM and 7 AM). Participants are divided into interdisciplinary teams and stay together in cabins without heat, electricity, or running water. Teams are transported to a fictional country that is experiencing a conflict-based humanitarian disaster, where they must conduct a rapid assessment. The simulated country is inhabited by approximately 100 role players, who represent members of the local population, refugees, humanitarian aid workers, government officials and troops, sick and wounded patients, militia (including child soldiers), and UN and OCHA officials. The site covers approximately 40 hectares, and participants must walk many miles over the course of the weekend. Quantitative information is presented via physical symbols or hand-written data. Role players provide qualitative information, the extent and accuracy of which varies according to the approach of the participants. Late Sunday morning, field activities cease. After lunch, teams present project proposals, which are verbally evaluated by faculty.

The last 2 hours of the event are spent debriefing in 3 sessions: faculty with all participants, role players with all participants, and individual teams. In the days following the event, additional debriefing sessions are held on campus based on academic program or mode of credit. These debriefings have been unstructured and typically take place over 1 to 2 hours. Course directors have collected some information in recent iterations from anonymous surveys and from large-group and individual debriefing sessions. Participants consistently express appreciation for the overall learning experience and the realism of the event. Many participants also express a desire for more feedback and direction. Participants vary in whether they find the simulation overwhelming or insufficiently challenging. Each year, role players note that participants struggle to balance treating role players humanely with acquiring the quantitative information needed to inform their plans for intervention. In both 2015 and 2016, an anonymous survey was administered. In each survey, nearly all respondents (90% to 95%) said their learning had increased by a moderate or high amount; however, more detailed questions were not included. The course directors first administered a pre- and post-simulation test in 2016. Average scores increased from 48% before the simulation to 57% after the simulation (P < 0.001).

The course directors have responded to feedback by reinforcing pre-simulation material through discrete in-simulation activities, giving role players a more expansive role, increasing the time spent on debriefing, creating a more detailed scenario, and titrating the amount of challenge year to year. The course directors more formally evaluated the simulation in 2019 by piloting an evaluation process as described below.

Method of evaluation

Three modalities of evaluation were used: a 20-question multiple-choice pre- and post-simulation test, participants’ self-assessments of their learning, and anonymous qualitative feedback via open-ended questions.

The test was created by course directors in collaboration with expert faculty. Its questions focused on basic concepts in humanitarian response, sector specific technical information, and Sphere minimum standards (Supplemental Table 2: Simulation Pre- and Post-Test). Faculty provided the questions for their specific content areas. Participants were provided with a de-identified code with which they took the pre-test online prior to accessing the material. Participants completed the post-test by hand immediately after the project proposals.

A week after the simulation, all participants were emailed a unique link to an online evaluation. Participants were sent a reminder email 2 weeks after the simulation. The online evaluation asked participants to self-assess their overall simulation learning outcomes and whether they made gains within the program learning objectives. The online evaluation also contained 2 short-answer questions: “What did you like most about the simulation?” and “What suggestions do you have to improve upon the Simulation?” These data are shown in Supplemental Table 3: Participant Qualitative Feedback. Short-answer responses were coded line by line by SK and SG to identify themes and their frequency. The evaluation process was exempted from review by the University of Minnesota IRB because it was a course-related research activity.
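The study’s coding was performed by hand; purely as an illustration of the theme-frequency step, the sketch below tallies theme codes assigned to short-answer responses. The theme codes and responses shown are hypothetical, not data from the study.

```python
# Illustrative sketch of counting theme frequencies after qualitative coding.
# The coding in the study was done manually by two authors (SK and SG); the
# theme codes and responses below are hypothetical.
from collections import Counter

# Each short-answer response has been assigned one or more theme codes.
coded_responses = [
    ["realism", "knowledge_gain"],
    ["realism", "program_management"],
    ["more_debriefing", "more_reflection_time"],
    ["realism", "more_mental_health_support"],
    ["knowledge_gain", "more_debriefing"],
]

# Flatten the code lists and tally how often each theme appears.
theme_counts = Counter(code for codes in coded_responses for code in codes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```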

Results

Pre- and post-test

A total of 46 out of 49 participants took both tests. The average score increased from 48% before the simulation to 71% after the simulation (P < 0.0001).
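The report does not name the statistical test behind this P value; for matched pre- and post-test scores, a paired t-test is a common choice. The sketch below is a minimal illustration of that analysis under that assumption, using hypothetical scores rather than the study data.

```python
# Minimal sketch of a paired pre/post comparison, assuming a paired t-test
# (the report gives only mean percent-correct scores and a P value).
# All scores below are hypothetical, for illustration only.
from scipy import stats

# Percent-correct scores for the same participants before and after the simulation.
pre = [45, 50, 40, 55, 48, 52, 46, 44, 50, 49]
post = [70, 75, 65, 80, 72, 68, 74, 66, 71, 69]

t_stat, p_value = stats.ttest_rel(post, pre)  # dependent-samples t-test

print(f"Mean pre-test:  {sum(pre) / len(pre):.1f}%")
print(f"Mean post-test: {sum(post) / len(post):.1f}%")
print(f"t = {t_stat:.2f}, P = {p_value:.1e}")
```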

Post-course self-assessment

A total of 30 out of 49 participants responded to the survey (61% response rate). Of these, 43% were health care workers. Most participants felt their knowledge of humanitarianism and the experience of refugees had increased by a moderate or high amount. Most participants expected to use what they learned in their future careers. These data are shown in Figure 1.

Figure 1. Participant assessment of overall learning experience.

Most participants somewhat or strongly agreed that they had worked toward achieving each learning objective. These data are shown in Figure 2.

Figure 2. Participant assessment of learning objective outcomes.

Anonymous qualitative feedback

Participants were asked what they liked best about the simulation and to make suggestions for improvement. The themes most commonly identified by participants as positive aspects of the simulation included the realism of the experience, participants’ increase in knowledge and skills, participants’ more nuanced understanding of the reality of humanitarian aid, and the overall program management. The most common suggestions for improvement were requests for more mental health support resources and trigger warnings, clearer expectations, more time for reflection and debriefing, more instruction and background preparation, and less emphasis on data collection (Supplemental Table 3).

Limitations

The online evaluation had a non-response rate of 39%. It is unknown whether participants’ gains in knowledge measured by the test were sustained. It is also unknown whether participants’ self-assessments of their gains in achieving the learning objectives were accurate, as their competence was not measured. Respondents were asked open-ended questions rather than being asked to comment on specific themes, and not all respondents provided qualitative feedback.

Discussion

Program strengths

The University of Minnesota Humanitarian Crisis Simulation, now held on 6 occasions, is an effective experiential learning program that increases participants’ knowledge in the field of disaster response. Participants especially appreciated the realism of the simulated experience, their perceived increase in knowledge and skills, and the deeper understanding they gained regarding humanitarian crisis response. The course directors have identified 5 factors that have contributed to the program’s success: high fidelity, active simulation management, feedback and debriefing, iterative design, and high-quality program management.

High fidelity simulation

The simulation contains scenarios common to humanitarian disasters, including high rates of malnutrition and infectious diseases, limited resources, and high levels of insecurity. However, the location and context are midwestern, which makes it harder for participants to suspend disbelief. Role players who have direct experience in humanitarian crises are asked to adapt their characters to incorporate their personal experience. Professional actors play key characters, direct activities in their location, and provide real-time coaching to other role players.

Active simulation management

Management of a complicated simulation is difficult, yet vitally important. Good management of a simulation has been shown to “significantly influence the quality of learning and the ability of translating that learning into real-life performance improvement.”9 The simulation follows a story arc with multiple independent and interrelated events designed to reinforce the pre-simulation material and prompt participants to engage actively in the simulation. A storyboard provides a general schedule (see Supplemental Table 4), but the specifics of timing are adjusted based on real-time observations. Participants respond to events by completing assignments, participating in exercises, and attending meetings. Role players, runners, and faculty coordinate the specific timing of events via mobile handsets (walkie-talkies) and group text messages. Teams are monitored to ensure they participate in each exercise and complete their assignments. The course directors have learned over the years that bad weather, technical difficulties, and participant gaming can derail even the most comprehensive planning. Examples of unscripted events include participants developing psychosis, role players taking hostages, and entire refugee camps disbanding or moving to other locations. These unplanned events have, at times, required significant improvisation on the part of the course directors.

Iterative design process

As outlined above, the course directors modify the course each year based on their observations and feedback from participants and role players. More content is added each year and simulation activities are now more structured to reinforce pre-simulation online content. The simulation also now places more emphasis on simulation elements that challenge participants to practice skills such as communication, collaboration, time management, professionalism, and emotional self-regulation. Course directors have made the simulation world more complex and detailed each year. The program also now provides more thorough debriefings, on-site mental health support, and more direction to faculty members.

Feedback and debriefing

Feedback and debriefing are widely considered to be 2 of the most important elements of experiential learning programs.10 Faculty members provide verbal feedback on team project proposals. Immediately after the simulation, role players provide feedback on whether participants correctly identified relevant qualitative information and on how well they related to role players. The joint debriefing sessions are described by both role players and participants as highly impactful. The amount of time devoted to post-event debriefing sessions has increased each year. In 2019, a structured debriefing process was piloted with a small group of public health students.

High quality program management

A dedicated project manager handles communications, finances, registration, and logistics before, during, and after the simulation, a role that requires a substantial time commitment. The course was supported with 0.3 administrative full-time equivalent. A number of staff and volunteers attend and provide support during the weekend event. Ensuring that volunteer role players and faculty have a good experience contributes to the quality of the program, as it has resulted in faculty and role players who contribute their expertise to the curriculum. The simulation has been modified in response to their feedback. Role players and faculty are provided with shelter, food, opportunities for breaks, and activities when not interacting with participants (e.g., games, role-playing exercises, and maintaining campfires) during the field exercise. Faculty and role players are encouraged to draw on their personal experience and be creative during the simulation. The large-group debrief with participants and role players was also implemented in response to role player feedback. Repeat volunteers and faculty have added value to the simulation by contributing elements specific to their expertise, and they have also served as ambassadors, promoting the simulation in the community.

Areas for future program growth and improvement

Learning outcomes

The program continues to have opportunities to increase participant knowledge. Course directors plan to make the pre-simulation material more interactive and further reinforce the material with in-simulation activities. Participants’ self-assessments of their achievement of learning objectives also demonstrated room for growth. The course directors continue to work to create opportunities for participants to practice skills associated with each learning objective.

Qualitative feedback

Participants continue to request more formal instruction, as well as more feedback and debriefing. The course directors have previously attempted to respond to this feedback, as detailed above. It is possible that the ambiguous and experiential format of the simulation, as well as its short duration, will never provide enough concrete instruction and feedback to satisfy all participants. Nevertheless, the course directors plan to provide written feedback on team proposals in future events and to expand the structured debriefing process to all participants.

The course directors were also concerned about the frequency of requests for more trigger warnings and mental health support. The role of the militia was slightly different in the most recent iteration and will continue to be modified. Trigger warnings will be given more emphasis in future iterations, and further dedicated instruction on psychological self-care will be added. On-site mental health professionals will continue to be available in the simulation and will be more clearly identified with specific uniforms.

The course directors continue to strive for the optimal balance between challenge and time for reflection, as well as the balance between emphasizing qualitative and quantitative data collection. The number of assignments and distractions will be reduced in the next iteration to allow more time for reflection. They also intend to add instruction on qualitative interviewing to provide scaffolding for participants when they interact with role players.

Conclusion

The Humanitarian Crisis Simulation has now been held on 6 occasions. Results from the 2019 post-course evaluation showed that participants’ knowledge increased as a result of their participation. Additionally, a large majority of participants self-reported increases in knowledge and skills, and a deeper understanding of humanitarian aid. More information is needed to evaluate whether the simulation increases participants’ competence or impacts their future career choices. The course directors will continue to incorporate lessons learned to improve the quality and experience of the program.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/dmp.2022.28

Author contributions

SK created and directed the course and wrote the manuscript.

EJ created and directed the course and contributed to the manuscript.

AS created the online curriculum, created evaluations, and reviewed the manuscript.

SG created the curriculum, attended debriefing sessions, analyzed evaluation results, and reviewed the manuscript.

LK and BHP created the curriculum, attended debriefing sessions, and reviewed the manuscript.

References

1. ReliefWeb. Global Humanitarian Overview 2020 [EN/AR/FR/ZH]. https://reliefweb.int/report/world/global-humanitarian-overview-2020-enarfrzh
2. ReliefWeb. Global Humanitarian Overview 2021 [EN/AR/FR/ZH]. https://reliefweb.int/report/world/global-humanitarian-overview-2021-enarfres
3. Gallardo AR, Meneghetti G, Franc JM, et al. Comparing resource management skills in a high- versus low-resource simulation scenario: a pilot study. Prehosp Disaster Med. 2020;35(1):83-87. doi: 10.1017/S1049023X19005107
4. Cranmer H, Chan JL, Kayden S, et al. Development of an evaluation framework suitable for assessing humanitarian workforce competencies during crisis simulation exercises. Prehosp Disaster Med. 2014;29(1):69-74. doi: 10.1017/S1049023X13009217
5. Walsh L, Subbarao I, Gebbie K, et al. Core competencies for disaster medicine and public health. Disaster Med Public Health Prep. 2012;6(1):44-52. doi: 10.1001/dmp.2012.4
6. The Sphere Handbook. Sphere standards; 2018. https://spherestandards.org/handbook-2018/
7. Core Humanitarian Standard on Quality and Accountability. CHS Alliance; 2014. https://corehumanitarianstandard.org/
8. YMCA Camp St. Croix. YMCA of the North. https://www.ymcanorth.org/camps/camp_st_croix
9. Wenzler I. The ten commandments for translating simulation results into real-life performance. Simul Gaming. 2008;40(1):98-109. doi: 10.1177/1046878107308077
10. Barry Issenberg S, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10-28. doi: 10.1080/01421590500046924