Or Understanding “Human Error”

Humans make mistakes. Any system that depends on perfect performance by humans is doomed to failure. In fact, the risk of an accident is more a function of the complexity of the system than of the people involved. Humans are not the weak link in a process. We are a source of resilience: we have the ability to respond to unpredictable inputs and variability in the system. The contents of this post are based on the work of Sidney Dekker in his book “The Field Guide to Understanding Human Error.”


Professor Dekker is a pilot and human factors engineer. Most of his work comes from analyzing industrial accidents and plane crashes. One such crash was the Tenerife disaster of March 1977, in which one jet rammed into another on the runway, killing 583 people.

We could blame the pilot for the crash. Had the pilot performed better, the accident could have been avoided. Remove such “bad apples,” and the system works fine, or so the reasoning goes.

However, on deeper inspection there were multiple causes (non-standardized language, bad weather, an overly crowded runway, equipment issues, etc.). It was not simply “human error” that caused this crash, but a series of problems. Understanding all of these causes reveals that pretty much any pilot could have made the same mistake. The system needs to change to promote pilot success.

Casting blame makes us feel like we’ve offered an appropriate response to a terrible event. However, blaming does not improve the system so that the next person doesn’t make the same mistake. In order to learn from our mistakes, we need to understand why they happened.

Local Rationality and Just Culture

No one comes to work wanting to do a bad job.

Sidney Dekker

The local rationality principle asks us to understand why an individual’s action made sense at the time. “The point is not to see where people went wrong, but why what they did made sense [to them].” We need to understand the entire situation exactly as they did at the time, not through the benefit of retrospection.

We balance the need to hold people accountable with the acknowledgment that most adverse events are not due to “human error.” We emphasize learning from mistakes over blaming individuals. We need zero tolerance for blameworthy acts like recklessness or sabotage, while not unfairly blaming individuals for system problems.

Just Culture Algorithm

The Just Culture algorithm asks a series of questions to determine the cause of an adverse event and offers an appropriate response. If an act was a deliberate act of sabotage, then severe sanctions are necessary. If reckless behavior led to the adverse outcome, the individual should be held accountable. However, if any individual’s actions in the same context could have led to the same result, then it is hardly fair to blame that person. The questions, and a rough sketch of the decision logic, follow below.

  1. Did the individual intend to cause harm? Did they come to work in some way impaired? This is sabotage.
  2. Did the individual do something they knew was unsafe? This is reckless behavior.
  3. Does the individual have a history of similar events with a similar root cause? This person is not learning from prior mistakes.
  4. Would three peers have made the same mistake in similar circumstances? This passes the substitution test. It is a no-blame error.
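To make the order of the questions concrete, here is a minimal sketch of the decision logic in Python. It is an illustration only: the question wording follows the list above, but the field names, response labels, and the classify function are assumptions, not part of Dekker’s text.

```python
# Illustrative sketch of the Just Culture questions as a decision function.
# Field names and response labels are assumptions, not Dekker's wording.
from dataclasses import dataclass


@dataclass
class EventReview:
    intended_harm: bool    # Q1: intent to cause harm, or came to work impaired
    knew_unsafe: bool      # Q2: knowingly did something unsafe
    repeat_offense: bool   # Q3: history of similar events with a similar root cause
    peers_would_err: bool  # Q4: would three peers have made the same mistake?


def classify(review: EventReview) -> str:
    """Walk the questions in order and return the suggested response."""
    if review.intended_harm:
        return "sabotage: severe sanctions"
    if review.knew_unsafe:
        return "reckless behavior: hold the individual accountable"
    if review.repeat_offense:
        return "not learning from prior mistakes: remediation"
    if review.peers_would_err:
        return "passes the substitution test: no-blame system error"
    return "needs further review"


# Example: an error that any peer might have made in the same circumstances.
print(classify(EventReview(False, False, False, True)))
# -> passes the substitution test: no-blame system error
```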

Analyzing Adverse Events

The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.

Dr. Lucian Leape

The old-school format of Morbidity and Mortality conferences pits the person who made the error against a room full of experts with the benefit of hindsight. This adversarial arrangement encouraged people to hide their mistakes. We need a new approach if we want to encourage bringing errors into the light for analysis, so that we can learn from them. Dekker describes six steps.

Step One: Assemble A Diverse Team

The team should include as many stakeholder perspectives as are pertinent. In medicine, we would include physicians, nurses, technicians, patients, and others. This team needs to have expertise in patient care (subject matter expertise) and in quality review. The one group not included is those who were directly involved in the adverse event. Their perspective will be incorporated through interviews, but they do not participate in the analysis.

Step Two: Build a Thin Timeline

In airplane crashes, investigators recover the flight recorder (black box) to create a timeline of events during the flight and conversations between parties. In medicine, we look at the chart to understand what happened and when. This is a starting point, but excludes the context needed to understand local rationality.

Step Three: Collect Human Factors Data

Interview the people directly involved in the adverse event to understand what happened from their point of view. This is best done as early as possible, as memory tends to degrade with time. Understand what was happening in the room, why they made the choices they did, and what their understanding of the situation was and why.

George Douros presents a series of questions on the EMCrit Podcast to guide the collection of this human factors data.

Collecting Human Factors Data (George Douros)

Step Four: Build a Thick Timeline

With the human factors data in hand, overlay it on the thin timeline to build a thick timeline. This presents the events as they occurred within the context under which the providers were working. You may need to go back and interview providers again until you understand what happened as they understood it at the time. Only then do we achieve local rationality.
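As a rough illustration, if we treat the thin timeline as a list of timestamped chart events and the interview notes as timestamped human factors observations, the thick timeline is simply the two streams interleaved in time order. Everything in this sketch (the data, the field names) is hypothetical:

```python
# Hypothetical sketch: merge timestamped chart events (thin timeline) with
# timestamped human factors notes from interviews to form a thick timeline.
from datetime import datetime

thin_timeline = [
    (datetime(2024, 1, 5, 14, 2), "chart", "Triage note recorded"),
    (datetime(2024, 1, 5, 14, 40), "chart", "ECG obtained"),
]

human_factors = [
    (datetime(2024, 1, 5, 14, 10), "interview", "Nurse covering two critical patients at once"),
    (datetime(2024, 1, 5, 14, 35), "interview", "Physician interrupted by EMS radio call"),
]

# The thick timeline interleaves both sources in time order, so each chart
# event can be read in the context the providers were actually working in.
thick_timeline = sorted(thin_timeline + human_factors, key=lambda entry: entry[0])

for timestamp, source, note in thick_timeline:
    print(f"{timestamp:%H:%M} [{source}] {note}")
```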

Step Five: Construct Causes

We don’t find causes. We construct causes from the evidence we collect. The causes of the error are complex and are not readily available to be discovered. We need to work to understand and propose possible causes. One method of organizing the causes is an Ishikawa diagram (or fishbone diagram), as shown in the figure and the sketch below.

Ishikawa (fishbone) diagram to analyze potential causes of adverse events. The adverse event is placed at the fish’s head on the right. Off the spine are potential areas where errors may arise. On each rib, place the potential error and supporting details.
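As a rough sketch, the fishbone structure can be thought of as a mapping from category (spine branches) to candidate causes (ribs), with the adverse event at the head. The category names and causes below are placeholders, not a real analysis.

```python
# Minimal sketch of a fishbone (Ishikawa) structure: the adverse event at the
# head, categories along the spine, and candidate causes on each rib.
# Category names and causes are placeholders for illustration only.
adverse_event = "Delayed recognition of patient deterioration"

fishbone = {
    "People": ["Short staffing on the shift", "New team members unfamiliar with the unit"],
    "Process": ["No escalation protocol for abnormal vitals"],
    "Equipment": ["Monitor alarm volume set too low"],
    "Environment": ["High patient volume in the department"],
}

print(f"Adverse event: {adverse_event}")
for category, causes in fishbone.items():
    print(f"  {category}:")
    for cause in causes:
        print(f"    - {cause}")
```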

Step Six: Make Recommendations

Brainstorm potential solutions that would prevent others from having the same outcomes. Ideally, recommendations are worded such that they are specific, measurable, achievable, relevant, and time-bound (SMART).

Legal Considerations

Remember that this information is protected. It includes patient data and as such falls under HIPAA. Do not put it on publicly available platforms such as Google Slides or Zoom.

  • M&M slides/presentations should be on standard RUMC slides.
  • All slides should have the QA language below in the footer of each slide.
    • All agenda and minutes should have the QA language too.
    • M&M slides can be requested by outside entities and may become discoverable.
  • Slide content should only include items found in Epic, minus patient identifiers (no MRNs).
  • Discussion points, analysis, or potential solutions should not be included in the M&M presentation; however, these topics can be discussed verbally in the meeting.

The entire quality improvement process should be a safe space to encourage providers to examine their errors. As such, it is protected under the Patient Safety and Quality Improvement Act of 2005 (Public Law 109-41), signed into law on July 29, 2005. However, plaintiffs’ attorneys can request these slides during discovery. Use our approved slide template, which includes the appropriate language, for example:

This document is privileged and confidential under the Illinois Medical Studies Act and should not be shared or distributed other than through the Quality Assurance Committee structure.

Remember also not to list on the slides what was done well or what could have been done better. You can ask the question on the slide, but leave the listing of items for discussion only. Do not write it down.

Recommendations for the Presentation

We have 40 minutes allotted for the presentation. Prior to presenting, meet with your team (attending, resident, nurse) to debrief the case, much like we do after codes.

  • Introduction: Remind the group that this is about learning and identifying systemic problems, not about blame & shame.
  • Nursing presentation: The nurse assigned to the case can introduce us to the case, walking us through any pre-hospital information, the initial presentation of the patient through triage and summarize nursing care. They will also provide the nursing perspective throughout — what is in the notes but also the external factors. The presenting nurse should talk to the triage and primary nurse for the patient and represent their thought processes to the audience.
  • Resident presentation: Present the physician’s perspective on the case, incorporating both the thin and the thick timelines.
  • Discuss the case: Identify potential causes with the group, possibly using a fishbone diagram.
  • Look for systemic problems and solutions: The goal of the exercise is to identify potential solutions that would prevent a similar mistake from happening again.
  • Evidence-based topic review: Present at least two evidence-based items on the topic.

Paperwork

To present an M&M at Rush, there is a bit of paperwork to fill out and send to Pamela Manning. These forms need to be completed two weeks prior to the talk.

  1. Conflict of interest form (takes about 5 minutes)
  2. M&M Flyer (takes about 10 minutes), which needs the name and title of the presentation as well as 3-5 objectives. The objectives should be written such that they are measurable (i.e., avoid the word “understand”).
  3. A copy of the PowerPoint as a PDF to complete the CME forms.

Use the Rush sanctioned PowerPoint template.

Dr. Patwari will assign you an M&M for the case and a faculty mentor to help you along and answer any questions. Please email the PowerPoint to Drs. Heinrich and Patwari for review before presentation.

If you would like an external consultant to join, Dr. Patwari can help coordinate this. Remember, we need to give the consultant enough advance notice.

Remember to reach out to the residents and faculty involved in the case to get insight that may not come across in the notes. Remember that these cases can be challenging; keep the aim of these discussions on what can be improved for the future.


References

  1. Sidney Dekker’s “Field Guide to Understanding Human Error”
  2. Angels of the Sky: Dorothy Kelly and the Tenerife Disaster
  3. EMCrit 249 – You Can Either Learn or You Can Blame – Fixing the Morbidity and Mortality Conference with George Douros
  4. The Patient Safety and Quality Improvement Act of 2005