MEdIC Series: The Case of the Difficult Debrief – Expert Review and Curated Community Commentary

Posted by Tamara McColl, MD FRCPC

Our second case of season 5, The Case of the Difficult Debrief, presented the scenario of a budding simulation educator who had a difficult experience debriefing a cohort of learners.

This month’s case was developed in collaboration with the team at Simulcast, an excellent simulation website operated by a team of emergency providers in Australia, whose work includes an online Journal Club based loosely on the MEdIC discussion concept. After reviewing this commentary, we encourage readers to check out their podcast, which delves into some of the issues that arose from this month’s MEdIC case.

The MEdIC team (Drs. Tamara McColl, Teresa Chan, Sarah Luckett-Gatopoulos, Eve Purdy, John Eicken, Alkarim Velji, and Brent Thoma) hosted an online discussion around the case over the past two weeks with insights from the ALiEM community. We are proud to present to you the curated commentary and our expert opinions. Thank you to all participants for contributing to the very rich discussions surrounding this case!

This follow-up post includes

  • Responses from our solicited experts:
    • Dr. Glenn Posner (@gdposner) is an Associate Professor and clinician educator in the Department of Obstetrics & Gynecology at the University of Ottawa. He is the inaugural Medical Director of the Simulation Patient Safety Program at The Ottawa Hospital and the Medical Director of the University of Ottawa Skills & Simulation Center. Glenn is also a Simulation Educator for the Royal College of Physicians & Surgeons of Canada. Dr. Posner’s program of research relates to the assessment of the intrinsic CanMEDS roles and patient-safety competencies using all forms of simulation.
    • Dr. Andrew Hall (@AKHallMD) is an Assistant Professor in the Department of Emergency Medicine at Queen’s University, where he is the Competency-Based Medical Education (CBME) Lead for the FRCPC-EM training program. He is a Simulation-based Resuscitation Rounds Instructor and runs the Simulation-based OSCE Assessment Program for EM residents. He completed a Master’s in Medical Education through the University of Dundee in Scotland. His current research areas include CBME program evaluation, the development and validation of competency-based assessment tools and processes, and the use of simulation and novel technologies for assessment.
  • A summary of insights from the ALiEM community derived from the Twitter and blog discussions
  • Freely downloadable PDF versions of the case and expert responses for use in continuing medical education activities

Expert Response 1: Managing Debriefing Roadblocks (Glenn D. Posner, MDCM FRCSC, MEd)

“The Case of the Difficult Debrief” describes the frustrations of a medical educator who is clearly passionate about using simulation-based education, but has run into some common roadblocks. Her challenges can be divided into three spheres: 1) setting the agenda; 2) managing a perceived lack of “buy-in”; and 3) the ideal phrasing of questions.

Whose Agenda Is It, Anyway?

I notice that some of Eliza’s frustration arises from the learners deviating from her perceived case objectives and discussing points that she felt were unnecessary or did not wish to discuss. It is unclear whether she outlined the case objectives with the resident group prior to the debrief. During a debriefing, we certainly want to cover the intended learning objectives, but it is also important to take the participants’ agenda into account and allow them to guide some of the discussion. I do not presume to know Eliza’s frame in this context, but she seems to be adopting an attitude that a debriefing is just like a teaching session with material that she needs to “get through.” As a senior simulation faculty member, I would attempt to re-frame Eliza’s frustrations by suggesting that a debriefing, unlike a classic structured teaching session, is a very different educational tool: the learners are subjected to what is often a highly stressful and emotionally charged situation, which can elicit really interesting and unexpected topics of conversation. These unanticipated discussions account for much of the richness of theatre-based simulation training, but also some of the anxiety for novice debriefers.

Participants never talk about the wrong thing (the customer is always right?); they talk about what they find interesting or personally challenging. Eliza will have her own list of performance gaps that she needs to close during the debriefing, but I suggest that she pay close attention during the reactions phase of the debriefing [1] and build a parallel agenda based on the ideas expressed there. Asking how it went is not the same as asking how it felt. Furthermore, she might want to experiment with the well-established “plus/delta” approach during the analysis phase of her debriefing. Using this approach, she will be able to compare the positive aspects of the team’s perceived performance (the “plus”) with her own notes, and she will explore the perceived gaps she needs to close (the “delta”). This will also allow her to determine which concepts the group already has insight into and which require more discussion. By being flexible and nimble with the agenda, this learner-centered approach shows respect for the learners and leads to a more satisfying experience for everyone. Objectives that were not adequately covered can always be addressed during the summary phase of the debriefing. Each simulation scenario should certainly be developed with a list of learning objectives, but one does not need to be dogmatically married to them. It is important that simulation educators demonstrate flexibility and creativity in how objectives are met and learner responses are managed for each unique group of learners who complete a simulation scenario.

Hiding in the Foreshadows – the Prebriefing

One of the most frustrating parts of being a simulation educator in general, and a debriefer in particular, is combatting a lack of buy-in from participants. I try to out-maneuver a lack of buy-in by focusing the group’s attention on elements of the scenario that did feel real for them and asking them how they would manage that part of the situation. This usually redirects them to discussing the case rather than their perceived lack of realism, and you can still be successful in highlighting important points. I never get defensive or allow myself to get dragged into a conversation about how we could make the scenario more realistic. Rather, I agree with them and help them to find some clinical relevance and educational value in the case. More importantly, I find that nothing mitigates this line of complaints during a debriefing better than conducting a very thorough pre-briefing [2]. Every simulation educator needs a “pre-flight checklist” of things to discuss during the pre-briefing, including but not limited to: orientation to the equipment, the setting of the scenario, the level of training of the participants, the role of the confederates, the fiction contract and the need for a suspension of disbelief, confidentiality, and the Basic Assumption of simulation (i.e., the belief that everyone is intelligent, capable, cares about doing their best, and wants to improve). I do not want to assume that Eliza’s pre-briefing was lacking, but I have found that many of the complaints I hear about realism can be almost completely eradicated by addressing those concerns during the pre-briefing.

Keep Calm and Stay Curious!

I also notice that Eliza’s description of her analysis phase sounds like she relies heavily on the Socratic method of teaching. In my experience, debriefing is not about asking questions; it is about facilitating a conversation. Eliza reveals her frame when she speaks of “hinting” and “mistakes,” and I would suggest a few key strategies to re-frame her thinking. First, hinting by an educator can be perceived as judgmental, so the “advocacy-inquiry” approach to analysis urges us to come out and say what we think [3]. For example, “Mike, I noticed that you ordered propofol for the airway induction when the patient was hypotensive. I wonder if you could walk me through your decision to use this drug?” This approach is counter-intuitive to those who feel that better long-term retention of learning will occur when we can get the learner to say the “correct” answer. However, this leads to my second point, which is that actions perceived by the educator as “mistakes” might actually be correct when viewed from a different angle. Advocacy-inquiry urges us to see errors not as simple mistakes that need to be identified and “fixed,” but rather as interesting insights into someone else’s frame of mind. This presents an opportunity for in-depth discussion and the possibility of re-framing the learner’s thought process to prevent such errors in the future. The key here is that mistakes made in a simulated session are not necessarily negative; rather, perceived mistakes are fascinating and can lead to rich discussions and long-lasting learning! If the concept of advocacy-inquiry seems too complicated, then the most important principle to remember for successful debriefing is to remain genuinely curious about why participants acted differently than you would have. It is not enough to recite the Basic Assumption; the key to holding the Basic Assumption is to always imagine and truly believe that there is a possible universe in which, under the right circumstances, from a different point of view, the actions taken were actually correct.

Conclusion

Debriefing after a theatre-based simulation can be one of the most rewarding and satisfying educational events for an educator to be involved in, and it has admittedly developed a bit of a mystique. However, let me assure novice debriefers that it is not “rocket science” – the key to success is to stay curious and tenaciously hold on to the Basic Assumption. A good debriefing happens in a safe container (established during the pre-briefing) and involves the sharing of ideas, the closure of performance gaps, and re-framing when necessary. Happy debriefing!

References

  1. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106-115. doi:10.1097/SIH.0000000000000072
  2. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation. Simul Healthc. 2014;9(6):339-349. doi:10.1097/SIH.0000000000000047
  3. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1(1):49-55.

Expert Response 2: Sharing the Experience (Dr. Andrew Hall, BHSc MD FRCPC MMEd)

Something Different

“If you can debrief in the clinical environment, you can debrief in the sim-lab!” These words were uttered by this ‘expert’ writer just last week at a Simulation Summit workshop in Montreal. One of the more astute instructors aptly pointed out, “I don’t know about that, Andrew. I think there is something different about debriefing in the sim-lab.” This statement gave me pause: is this true? And if so, what is that difference?

Voyer and Hatala [1] argue that debriefing in the sim-lab is very similar to providing feedback in the clinical environment. In fact, they suggest that, in comparison to the clinical environment, some features of the simulated environment “… positively influence the exchange of meaningful feedback” – the ability to start with a pre-brief, the opportunity to perform direct observation without interference, and the fact that trainees can make mistakes without causing harm. Shouldn’t these advantages make debriefing in the sim-lab easier than providing feedback in the clinical environment? So why is this not always the case?

In the simulated environment, the case is not real. It is a made-up situation with a set of pre-established priorities. Rather than both parties observing and experiencing the case together, one party (the educator) has a predefined image of what should happen. My experience with most simulation debriefs is that conflict arises between what the educator feels the trainee should have done and what the trainee feels they should have done. This conflict is usually rooted in the fact that the trainee’s experience of the case was different from what the educator had planned. If the educator fails to recognize this different experience, then an unavoidable tension develops right from the start of the debrief, resulting in problems such as trainee defensiveness, debriefs that “talk about the wrong things,” and debriefs that simply do not work.

Good Debriefing

Debriefing is important. In a systematic review of the features of medical simulations that lead to effective learning, Issenberg et al. [2] concluded that providing feedback is the most important feature of simulation-based medical education. Furthermore, the perceived skill of the debriefer has the highest independent correlation with the perceived overall quality of the simulation [3]. As such, many excellent methods and strategies have been developed for debriefing, and most debriefing experts would now be comfortable accepting that there is more than one way to perform an effective debrief [4].

I do not endorse any one specific strategy over another, but rather look to well-established principles of medical education to guide my debriefing. I particularly appreciate Fanning and Gaba’s description of debriefing as a “facilitated or guided reflection in the cycle of experiential learning” [3]. This framing, however, does not provide clear or concrete steps for facilitating a debrief. Kolb’s Experiential Learning Cycle [5] seems to combine the best features of debriefing models into one simple, easy-to-remember concept: by first reviewing the experience and then drawing conclusions from it, an effective facilitator can lead participants through a discussion of the events that helps them reflect, plan future actions, and produce long-lasting learning.

You Are More Important Than Me

It is important to check in and remember why it is we are doing what we do. There is a famous quote attributed to Socrates that I have adopted as a teaching philosophy: “Education is the kindling of a flame, not the filling of a vessel.” The implication of this quote is that I, the educator, should not be attempting to fill you, the learner, with my knowledge (or priorities), but rather should be attempting to discover what ignites your learning process. We spend much of our time on our objectives instead of understanding what actually happens in our simulated cases. We must observe what the learner experienced, not what they did. Discussing whether or not a trainee checked a glucose is less important than discussing what experiences resulted in them not checking that glucose. Many effective feedback strategies start with this requirement of coming to a shared understanding of what happened, as, for example, in the Reactions and Description phases of the PEARLS framework [6].

With this powerful observation, we can guide the learner through a process of debriefing that is facilitative and can induce deep learning. Susan can help Eliza by suggesting that, for a little while, she ignore her objectives and start at the top of Kolb’s Learning Cycle, with questions such as: What did you experience? What did you see that led you to perform the actions you did? What could you have done differently, and why? What could you do differently next time? In the best debriefings that I have experienced (either as a learner or a debriefer, in the clinical environment or in the sim-lab), relatively little has been said by the debriefer. The learners guide the discussion.

References

  1. Voyer S, Hatala R. Debriefing and feedback: two sides of the same coin? Simul Healthc. 2015;10(2):67-68.
  2. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10-28.
  3. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007;2(2):115-125.
  4. Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simul Healthc. 2016;11(3):209-217.
  5. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. London: Prentice Hall; 1984.
  6. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106-115.

Curated from the Community (Dr. Eve Purdy MD, FRCPC Candidate, MA Candidate)

This week’s case is pretty meta – debriefing the debrief. As readers, we entered the control room and became privy to a behind-the-scenes discussion between two educators. I felt like a spy! Eliza, a new simulation educator, was dejected after what she considered a “rough session.” Her boss, Susan, listened to her vent. Eliza’s concerns centered on residents not buying into the scenarios, debriefs not going according to plan, and thoughts that she may not be good enough for the job. The audience was asked to consider a number of challenging questions, including the differences between being a good bedside teacher and being a good simulation and debrief facilitator, the ideal content of a debrief, and how to navigate priorities during a simulation session. Furthermore, we were asked to reflect on how Susan might help Eliza. The enthusiastic simulation community did not disappoint: a robust virtual community of practice came out to thoughtfully discuss the case, and a number of themes emerged.

Teaching Students to Sim

The first theme that emerged was the importance of the pre-brief. Eliza mentioned that residents were focused on fidelity issues with the mannequins and showed a lack of buy-in to the scenarios. Discussion participants reflected on the fact that the pre-brief may not have been adequate. Ben Symon suggested that Eliza may not have fostered a sufficient foundation of trust for the simulation to play out as she initially envisioned. Rob Bryant outlined the pre-briefing practices at his institution, which include 30 minutes spent discussing the basic assumptions, creating psychological safety, suspending disbelief, and defining learning objectives. These practices are certainly necessary, but perhaps they are not sufficient?

Two participants, Damon Dagnone and Vic Brazil, pushed the community to consider going beyond the pre-brief when it comes to teaching students how to sim. Vic reflected that we spend a great deal of time and effort learning how to be effective simulation educators, but perhaps we should be spending a similar amount of effort helping learners understand how to be effective simulation participants. She reminded us that learners have a responsibility for the debrief, too. Damon outlined what might be the simulation educator’s dream, but a situation that is unlikely to be the norm: at Queen’s, he has sustained and frequent interactions with residents in and out of the simulation lab over 5 years of training. He reflected that this longitudinal interaction allows him to build trust with residents and cultivate their effectiveness as simulation participants over years rather than hours.

As learners engage more frequently with simulation throughout their training, they are likely to become better simulation participants. But we might be able to help them in that process by explicitly teaching learners how to sim.

Facilitating, Not Teaching?

A second focus of the discussion was whether the skills of a simulation facilitator overlap with the skills of a medical educator. The community seemed to agree that while the skills that make a good facilitator would certainly help at the bedside, being an educator does not a master facilitator make.

Adam Cheng highlighted the differences between teaching and facilitating. He wrote, “Being an effective facilitator may require a mindset shift for some educators; from being present for the purposes of TEACHING, to being present for the purposes of FACILITATING LEARNING.” Rob Bryant further suggested that “learning how to teach by listening, rather than teaching by telling, is an acquired skill even for accomplished bedside/on shift teachers.”

Many reflected on the fact that we should be doing more facilitating outside of the simulation lab. Shannon McNamara shared how she uses the facilitation model in clinical teaching.

“I find myself doing much more clinical debriefing these days – reflecting on our team performance and individual learner decisions in the clinical setting. The tools are the same. Advocacy inquiry works great in the ED.” She goes on to say, “I find the just culture model to be essential, paired with the basic assumption we use in simulation.” She then argues that “what makes a good simulation debriefing – productive, respectful, constructive reflection on performance – makes good bedside teaching.”

This conversation encourages simulation facilitators to bring their skills in creating a psychologically safe environment and in debriefing to clinical shifts to elevate the performance of their learners and other team members. What happens in the sim lab stays in the sim lab, but your skills do not have to!

The Good Debrief

Participants were asked, “What makes a good debrief?” The answers were almost as varied as the individuals in the conversation. The community’s response to this question highlighted the heterogeneity in the approaches of the educators involved and is evidence that there probably is no one right way to do this. A few common themes included:

  • Did the debrief meet its predefined objectives? If the answer is no, this does not mean the debrief was ineffective; community members offered ways to cover objectives even when the debrief did not lead in that specific direction. It does, however, prompt the question of why not. Reflecting on the pre-brief, the case itself, the learners involved, and the debrief style is the next step in unpacking why predefined learning objectives were not met.
  • Honesty. Many members of the community stated that having learners guess what went wrong, or baiting them into learning goals, is not appropriate. Simulation facilitators should be direct with their observations and then genuinely curious about what occurred and why. This is the advocacy-inquiry approach.
  • Juggling the tension between facilitator and learner goals. The community seemed to agree that, in simulation, tension often develops between what the facilitator was hoping to cover and what the learners identify as their goals. This tension is managed in various ways. Adam Cheng’s approach of tackling the common agenda first seems to make sense; his full approach is outlined in Figure 1. The community also encouraged each other to consider whether their debriefs are more facilitator driven or learner driven, and to consider which style is most appropriate for which context.
  • Debrief tools. Ben Symon pointed out that there is an immense number of resources and tools available to improve debriefing skills. The group compiled quite a collection; a list of those mentioned is available in the resource section below.

Figure 1: Cheng’s Approach to Prioritizing the Debrief Agenda

  1. Tackle common agenda items first
  2. Issues that relate to critical or life-threatening errors or patient safety concerns
  3. High-priority learner objectives (multiple learners have expressed interest)
  4. Instructor agenda items

Debriefing the Debrief

Finally, the community reflected on the interaction between Eliza, a rookie simulation educator, and her more experienced boss, Susan. Glenn Posner highlighted how Susan’s patience and active listening speak volumes about her skill as a mentor. Before deciding what Susan should say or do next, George Mastoras suggested that she ask Eliza what she needs. Perhaps Eliza just wants to vent, or maybe she is actually ready for a true debrief of the debrief. Asking struggling colleagues whether they are ready for a deep dive is appropriate.

The community suggested that the same skills that make a good simulation facilitator will help Susan navigate this interaction with Eliza. Vic Brazil points out that “one of the challenges in debriefing is knowing how to judge our performance, and recognizing that we might be just as insightless as some of our learners.” As such, Susan can employ some of our standard facilitator debriefing techniques and perhaps offer to observe Eliza’s next session so that she can engage in a true advocacy-inquiry approach.

Damon Dagnone reminded educators to be gentle on themselves, a welcome suggestion for more junior educators in the crowd.

The Curator’s Personal Perspective

As curator of this month’s discussion, I was again humbled by the efforts, thoughtfulness, reflective practice, and concern that this group of educators has for their learners. Often in medicine, the patients we work the hardest for are the ones who appreciate us the least. We continue to work hard for these patients because we know that recognition is not the purpose of our job. It strikes me that learners, like patients, often have very little insight into how hard their teachers work and how much they care. As a curator of your thoughts, being privy to the efforts, struggles, enthusiasm, and degree of caring demonstrated by this community is always an immense privilege. Thank you for all that you do, and a double thank you for sharing your expertise while exploring this rich case.

Contributors

A special thank you to all our readers who participated in the online discussion:

  • Adam Cheng
  • Ben Symon
  • Bonwen @bronespen
  • Damon Dagnone
  • Eve Purdy @purdy_eve
  • George Mastoras
  • Glenn Posner
  • Rob Bryant
  • Shannon McNamara
  • Tamara McColl @TamaraMcColl
  • Teresa Chan @TChanMD
  • Vic Brazil @socraticEM

Resources

Cheng, A., et al. 2016. Learner-centered debriefing for health care simulation education: lessons for faculty development. Simulation in Healthcare; 11(1): 32-40. http://journals.lww.com/simulationinhealthcare/Fulltext/2016/02000/Learner_Centered_Debriefing_for_Health_Care.5.aspx

Eppich, W., & Cheng, A. 2015. Promoting excellence and reflective learning in simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simulation in Healthcare; 10(2): 106-116. https://www.ncbi.nlm.nih.gov/pubmed/25710312

PEARLS Healthcare Debriefing Tool https://debrief2learn.org/pearls-debriefing-tool/

Bajaj, K. et al. 2017. The PEARLS healthcare debriefing tool. Academic Medicine; epub ahead of print. http://journals.lww.com/academicmedicine/Citation/publishahead/The_PEARLS_Healthcare_Debriefing_Tool.98069.aspx

DASH debrief tool https://harvardmedsim.org/debriefing-assessment-for-simulation-in-healthcare-dash/

OSAD assessment of debriefing tool https://www1.imperial.ac.uk/resources/CFE7DECB-8FE7-437C-8DAA-6AB6C5958D66/debriefingosadtool.pdf

Case and Responses for Download

Click here to download the case and responses as a PDF (182 kb).

Author information

Tamara McColl, MD FRCPC

Associate Editor, ALiEM MEdIC Series
Emergency Physician, St. Boniface Hospital, WRHA
Academic Lead, Educational Scholarship
Department of Emergency Medicine
University of Manitoba
