Monthly Archives: February 2015

Scenario Design – 2/27/15

Photo Credit: Umpqua via Compfight cc

Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. David A. Cook. Medical Teacher. 2013; 35: e867–e898

Simulation in healthcare education: A best evidence practical guide. AMEE Guide No. 82. Ivette Motola. Medical Teacher. 2013; 35: e1511–e1530

This best evidence practical guide for simulation in healthcare education is highly recommended reading for educators. It is part 2 of the Guide and is focused on “the educational principles that lead to effective learning.” Curriculum integration, feedback and debriefing, deliberate practice, mastery learning, capturing clinical variation, individualized learning, and finally approaches to team training are discussed. Each topic is defined and explored in relation to its effective use in simulation, with practical implementation points and challenges that may be encountered.


Developing High-Fidelity Health Care Simulation Scenarios: A Guide for Educators and Professionals. Guillaume Alinier. Simulation & Gaming. 2011; 42(1): 9–26

This is a conversational piece on important considerations in scenario design that concludes with a sample scenario design template. It’s a good article for prompting new simulation educators (or just disorganized people like myself (-:) to think through how we build scenarios to address specific objectives and challenge learners in specific ways. Figure 1 depicts general courses of action and prompts the educator to think about actions or conditions that result in a change in patient condition as the scenario unfolds. It’s a good synthesizing graphic.
The section on “Preparing Scenarios in an Organized Manner” is particularly helpful as a practical guide to planning a smooth-running scenario. The Salas article (see SMARTER, two articles down) and the TEACH Sim article resonate better with me and reflect the human factors and observational assessment orientation, but this is easily digestible and says much of the same stuff in different ways.

The Template of Events for Applied and Critical Healthcare Simulation (TEACH Sim): A Tool for Systematic Simulation Scenario Design. Benishek & Salas et al. Sim Healthcare 10:21–30, 2015

The authors review and compare five existing scenario design templates and then feature their own design, TEACH Sim. This is a great evidence-based tool for designing simulation scenarios. In fact, we were so impressed that we have already started using it (with permission from Eduardo Salas, PhD). It has a logical flow that is easy to understand and, with the addition of our own simulation operation flow chart for high-fidelity scenarios, includes all the components for effective scenario design. Instructions for their template are also included.

A Measurement Tool for Simulation-Based Training in Emergency Medicine: The Simulation Module for Assessment of Resident Targeted Event Responses (SMARTER) Approach. Rosen MA et al. Sim Healthcare 3:170–179, 2008

An event-based approach to training, measurement, assessment, and feedback. It offers a systematic fashion to do several things: “1. Develop and maintain links between simulation scenario events, performance measures, and ACGME core competencies, 2. generate diagnostic measurement that feeds the processes of providing corrective feedback to accelerate skill acquisition and provide learning outcomes data rooted in the ACGME core competencies, 3. to provide opportunities to perform that are structured to maximize learning and assessment opportunities.”

It is an 8-step process.

Each scenario is designed to sample a specific part of the core competencies’ content. You can’t measure everything at once, but you can run multiple sims, or use multiple non-simulation observations, to triangulate on the components of the core competency you are trying to grade.

A practical guide, with well-delineated examples at multiple points in the process of scenario creation. The examples are focused on EM and perhaps critical care cases but could be abstracted to scenario generation in other disciplines.

Personal thoughts: Why didn’t I read this 7 years ago? Check out the Academic Life in Emergency Medicine crew for some nice examples of simulation cases linked to ACGME core competencies.

Debriefing and Simulation Trail Mix – a little bit of a bunch of things

Photo Credit: CapsLK via Compfight cc

Alternative Educational Models for Interdisciplinary Student Teams. Judy L. LeFlore. Sim Healthcare 4:135–142, 2009

13 teams were randomized to self-directed learning with facilitated debriefing (described in this paper as allowing students to proceed through the scenario with little or no input from the instructor until after the scenario) versus instructor-modeled learning with modified debriefing afterward (defined in this paper as instructors modeling the appropriate responses and interventions during a team simulated clinical scenario while the students observe, with the experts verbalizing what they are doing and why it works; the students then participate in a simulation themselves). Teams were composed of seemingly all novice students (NP, RN, RT, and social work). Debriefing was reportedly performed by someone trained through a comprehensive simulation workshop and debriefing workshop; the exact question styles used were not stated.

Results: Instructor-modeled learning (IML) showed no difference on the knowledge test versus self-directed learning with facilitated debriefing. However, the IML groups had higher satisfaction scores and higher overall teamwork and CRM scores (behavioral assessment checklist); the IML teams also had quicker times to intervention based on the Technical Evaluation Tool used.

Questions from me: People are generally happier when they explicitly know what is expected of them. Did the IML just teach to the test/evaluation? Is that necessarily a bad thing? Does this work better for novice versus expert learners? How do we model what we want students to do? Video, other approaches, simprov? Personally, I have had our residents watch videos of teams prior to coming into our simulations (shoutout to the Stanford AIM Lab!).

Directed self-regulated learning versus instructor regulated learning in simulation training. Ryan Brydges. Medical Education 2012; 46: 648–656

First, what is directed self-regulated learning? Well, “It requires a knowledgeable educator to design practice conditions using validated learning principles. A trainee then steps into this structured setting and is given a limited control of a specific aspect of practice and therefore is metacognitively, behaviorally, and motivationally active in the learning experience.”

Instructor-regulated learning, as described in their paper, is similar to what many centers do for procedural group training: one instructor, four residents, everybody gets a turn to demonstrate the skill, then you move on.

45 PGY1 internal medicine residents completed a standard pre-training curriculum, with standard testing and questionnaires before and after the educational intervention, and were randomly assigned to directed self-regulated learning (DSRL) or instructor-regulated learning (IRL) for learning lumbar puncture by simulation. In the DSRL arm, a solo participant could choose to progress from easy to hard LP simulator models on their own time within a 35-minute frame and could replay the pre-training video ad lib during the session, then took the post-test and received 15 minutes of feedback. In the IRL arm (instructor ratio 1:4), participants did not have access to the video during the session since the instructor was the resource; instructor and students collectively decided when to progress from easy to hard, still within a 35-minute frame shared by the 4 students (as opposed to 1), then took the post-test and received 15 minutes of feedback as a group. Everybody then got a 3-month retention test.

23 residents completed the study. Trained/blinded experts rated videotaped procedures.

Pre- and post-test findings were similar (although the IRL post-test trended higher), but at the 3-month time frame the DSRL group showed a trend toward better retention on the checklist score and a significant score increase on the GRS (global rating score).

Maybe this speaks to something about being responsible adult learners who are invested in their own learning… Multiple moving parts here, small study, food for thought.

Comparison of Postsimulation Debriefing Versus In-Simulation Debriefing in Medical Simulation. Jon N. Van Heukelom, MD. Sim Healthcare 5:91–97, 2010

Both post-simulation and in-simulation debriefing have been used in medical simulation, but the question remains: which method is more effective? The authors compared these two styles of debriefing with 161 third-year medical students participating in ACLS simulations. A retrospective pre-post assessment was completed by the students on “self reported confidence, level of knowledge related to medical resuscitation and the simulation itself.”

The post-simulation debrief group gave higher rankings for “effectiveness of the debriefing style, debriefing leading to effective learning, and understanding of correct and incorrect actions.” These results were statistically significant, and students rated this method as more effective overall.

The discussion noted advantages and disadvantages of each method and their effect on the quality of learning. Notably, their students “did not feel that interruptions during a simulation significantly altered the realism of the simulation.”

Noted limitations of the study are that it examined one level of learners in one type of simulation, so the results cannot be generalized to other levels of learners or other types of simulation. It did not include clinical outcome data, nor did it specify the amount of time spent in simulation versus debriefing for each group (total length of time = 20 min).

When Things Do Not Go as Expected: Scenario Life Savers. Dieckmann et al. Sim Healthcare 5:219–225, 2010

Dieckmann et al. discuss the need to build into scenarios the possibility that learners will not proceed down the paths that the instructor considers possible or likely. This might result from altered learner comprehension of the meanings and cues within a scenario, inability to accept the scenario as plausible, or a mismatch between the scenario difficulty and the learner’s ability. These situations can compromise the potential for learning in a simulation.

“Life savers” attempt to rescue the scenario and make it meaningful and relevant for the learner. These life savers can be brought into the scenario from within or outside the scenario. Examples of altering the scenario from within would include having a confederate administer a drug that is necessary to the progress of the scenario when the learners have failed to do it, or manipulating vital signs on the fly from the control room to make manikin status changes more or less obvious. Examples of altering the scenario from outside would include speaking via the overhead speaker into a scenario to stop participants from doing an action that would be harmful to themselves or to the manikin, or stopping and restarting a scenario to recover from control room or technology mishaps.

Life savers that are employed by a role player from within a scenario must follow the logic of the scenario; they must make sense to the learner in the context of the situation. By contrast, life savers brought from outside the scenario do not need to, and in some cases can intentionally disrupt the scenario.

This has implications for both scenario design and prebriefing. During design of scenarios, instructors should consider the potential need for life savers, and how they might be implemented if necessary. The learners should be prepared for this possibility during the prebrief.

Serious Games

Serious Games for Education and Training. De Gloria, A et al. International Journal of Serious Games. Vol 1, (1), 2014

“Contextualizing the player’s experience in challenging realistic environments, supporting situated cognition.” Allow players to be in a play environment to explore formal learning in a wide variety of situations, even to mimic work in “complex/costly environments or dangerous/critical situations.”

This is a great soup to nuts review (for a non-serious gamer) of the state of the art of serious gaming.

Interesting concepts:
1. Constructivist learning theories in gaming – knowledge is created through experience while exploring the world and performing activities, but designers have to be mindful of cognitive load theory: the game has to balance its inherent challenge against the player’s ability to address and overcome that challenge.
2. “Flow” as a measure of engagement in an educational game, with 8 components: concentration, challenge, skills, control, clear goals, feedback, immersion, and social interaction.
3. Assessment and feedback through the serious game – different learning analytics through which to provide feedback (may also include cameras, eye trackers, vital signs), as well as assessment to figure out what the learner has learned. This is not mentioned in this article, but check out (play an interview – and see the feedback you get…)

Figure 4 is a reminder of Bloom’s taxonomy and Kolb’s learning theory; also brought up was the Nonaka SECI model (socialization, externalization, combination, internalization).

From Content to Context: Videogames as Designed Experience. Squire K. Educational Researcher. Vol 35, no 8, pp 19–29, Nov 2006

“Videogames as designed experiences, in which participants learn through a grammar of doing and being.”

Table 1 presents interesting aspects of the learner and game as interface that map easily onto learning goals in Bloom’s taxonomy and Kolb’s learning stages.

Interesting side tales about the Grand Theft Auto game, as well as Civilization III, and a game called Supercharged! (to help learn about electrostatic physics.) Not everything in serious games is overtly on the military front.

There is the possibility of anonymity as a member of a game: being able to experiment, learn through successes, learn through failure, redo and improve, and become an expert problem solver. Learning from a game is only as good as the work that goes into the designed experience for the learner.

A Comprehensive Review of Serious Games in Health Professions. Ricciardi F, De Paolis LT. International Journal of Computer Games Technology. Vol 2014, Article ID 787968

In their review of serious games developed for health professions, the authors described the “state of the art of serious games” and sought to understand whether they are useful tools for training, as well as the benefits and the current issues related to them. The review was grouped by area of application (surgery, odontology, nursing, cardiology, first aid, dietician and diabetes, psychology, and other). They concluded that serious games for health professions are not as widespread as expected, and the highest number of games was found in the field of first aid. The papers reviewed focused mostly on the development of serious games and how they were used in a particular area of health.

***In the serious games reviewed, only a few included an evaluation component (there is no standardized way to do this in serious games).

Another issue identified was the lack of a multiplayer component in many of these games; since healthcare now emphasizes the importance of a teamwork approach, the authors considered this a shortcoming.

Four factors were identified that differentiate serious games from traditional simulators: entertainment factor, development costs, development time, and deployment costs. The entertainment factor was the obvious advantage, and each of the other factors was claimed to be lower in cost for serious games.

Questions from the reading: Who develops the games? Who establishes that a game is valid for the skills being learned? What is the time/cost analysis for development, testing, and use of serious games versus simulation training?

de Wit-Zuurendonk L, Oei S. Serious gaming in women’s health care. BJOG 2011; 118 (Suppl. 3): 17–21

A clipped quote from the article: their systematic search in PubMed “revealed zero hits for serious gaming in women’s health care.”

Learning background theory: four elements are important to adult learning – autonomy, past experience, being goal oriented (or orientated), and being problem-based rather than content-oriented learners.

People enjoy gaming, and there is some evidence that serious games are associated with increased technical skills. Male students like gaming more than female students (from data in the supplement).

On the non-technical skills side: one study showed that triage performance was better after a serious game than after a card-based game.

Cons of serious games: solo work; not able to fully replicate team dynamics.

Take Away thoughts from the morning:

Not a ton of data yet about the efficacy of serious gaming in medicine (but perhaps we take the same leap of faith from the aviation community that we took for medical simulation…).

Could serious games be the pre-reading/preparation work of the future for in person simulation training?

Interesting applications are out and about; they are not all military-based anymore.

The VCUHS Geriatrics Division, along with the school of medicine, developed a serious game of sorts – more interprofessional interaction, with faculty facilitation, using simulated patient information, not in real time, not in virtual reality – but I think this counts.