In the Morbidity and Mortality (M&M) conferences of the past, a physician would present their case in front of a lecture hall of their peers, superiors and subordinates. They were grilled by the audience (all blessed with the benefit of hindsight bias, also known as the Monday Morning Quarterback bias). The individual was held accountable for the outcome, regardless of how much of it was their own fault. It was a pretty daunting endeavor. Rather than face the ire of their peers, people instead would hide their mistakes. Hidden mistakes are not available to improvement processes.

Medical errors arise not only from the improper behavior of healthcare professionals, but also from multiple other factors: the rapid pace of technological change, uncertainty in data, lack of resources, and the complexity of patient conditions. Putting the blame squarely on the shoulders of the provider ignores all these other contributing factors.

James Reason wrote in his 1997 book, Managing the Risks of Organizational Accidents, that blame and punishment are not solutions for medical error. In fact, those who commit an “error” are themselves negatively affected by it; they need support, as well as the opportunity to make restitution and improve. These providers did not come to work intending to harm someone, and blaming and punishing them only adds insult to injury. Worse, blame and punishment disincentivize people from coming forward when they recognize a potential error.

Given that “to err is human,” systems should acknowledge this human characteristic. Systems need to be designed to anticipate these “human errors” and to mitigate or eliminate them. Simply blaming the human humiliates the provider without improving care for future patients. Our goal is to identify errors and build a better, more resilient system that prevents future errors.

Just Culture: Blame vs. Accountability

If we don’t blame or punish, how do we hold people accountable for their mistakes? There is a difference between blame and accountability. Blame and punishment are only appropriate in rare instances, such as criminal acts: patient abuse, working while intoxicated or acting in a deliberately unsafe manner. Our response to other errors needs to be just.

David Marx argued that we shouldn’t blame those who intentionally chose an action but didn’t mean to cause harm. Of course, saboteurs who intentionally choose to harm a patient should be sanctioned. James Reason formalized these distinctions in a culpability decision tree, commonly known as the Just Culture algorithm.

The Just Culture algorithm for assigning accountability. Start in the top left corner and answer the questions as they arise. As you move from left to right, the provider has less culpability, and the repercussions should be assigned accordingly. For example, the saboteur (far left) may be fired or even face legal charges, while a provider involved in a system-induced error should not be held accountable for a mistake the system caused.

Some general principles of the algorithm are: 

  • If an action was intended and the harmful consequence was intended, this is sabotage and the provider should be sanctioned. 
  • If the action was intended but the adverse consequence wasn’t, but substance use was involved, the provider may require treatment but should also be removed from patient care where the opportunity to cause future harm persists. 
  • If there was no substance use and harm wasn’t intended, but the provider wasn’t following common practice, this is reckless practice. This person may require retraining and a citation on their performance evaluation. We should also consider, though, how reasonable and workable that common practice actually is.
  • If three other practitioners would have done the same thing in the same situation (the substitution test), then we may need to help everyone make better choices, or better yet, redesign the system to prevent the error from happening. 
  • If none of these apply, we may have a blameless error and we need to fix the system. 
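The decision sequence above can be sketched as a simple function that walks the questions from most to least culpable. This is a minimal illustration only: the parameter names and the returned labels are illustrative choices, not official Just Culture terminology, and a real review would weigh each question with far more nuance than a boolean.

```python
# Illustrative sketch of the Just Culture decision sequence described above.
# Parameter names and labels are this sketch's own, not official terminology.

def classify_error(action_intended: bool,
                   harm_intended: bool,
                   substance_use: bool,
                   followed_common_practice: bool,
                   others_would_do_same: bool) -> str:
    """Walk the culpability questions from most to least culpable."""
    if action_intended and harm_intended:
        # Action and harm both intended: sabotage.
        return "sabotage: sanction the provider"
    if substance_use:
        # Substance use involved: treat, but remove from patient care.
        return "substance-related: treatment and removal from patient care"
    if not followed_common_practice:
        # Knowingly departed from common practice: reckless practice.
        return "reckless practice: retraining and a performance citation"
    if others_would_do_same:
        # The substitution test: peers would have done the same thing.
        return "system-induced: redesign the system to prevent the error"
    # None of the above apply: a blameless error.
    return "blameless error: fix the system"

# A provider who did what most peers would have done points to the system:
print(classify_error(False, False, False, True, True))
```

Running the example prints the system-induced branch, reflecting the principle that an error most peers would also have made calls for redesigning the system rather than punishing the individual.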

Another depiction of the Just Culture algorithm, this one including the repercussions of each behavior in the square Post-it-note-style boxes.

Just culture takes into account that the system plays a role in human behavior. We already asserted that errors will always happen; even the most expert physician will make “errors.” Our systems should be designed to work within the conditions under which providers actually work, and workflows should include defenses that avert errors. Much as a car cannot be started while in gear, which prevents it from lurching forward, we should put mechanisms in place that don’t allow commonly known unsafe actions to occur.

Safe, Reliable and Effective Care = Culture + Learning + Leadership

There are several necessary components to having a culture that promotes patient safety. The IHI framework for Safe, Reliable and Effective Care has several components: 

  • Organizational culture: the values, behaviors, attitudes and competencies of a group of people that lead to safety; every person contributes to this culture
    • Psychological safety: people feel comfortable bringing things up, asking questions and reporting errors
    • Accountability: holding individuals responsible for acting in a safe manner and training them to do so 
    • Teamwork and communication: teams share understanding, anticipate needs, and apply standard tools for communication and managing conflict 
    • Negotiation: creating agreement on matters of importance to team members and patients/families
  • Learning systems: measure performance and help teams make improvements 
    • Continuous Learning: continually collecting and learning from errors and successes
    • Improvement & Measurement: strengthening work processes and patient outcomes using improvement science, including measurement over time
    • Reliability: using the best evidence to standardize care with the aim of failure-free operation over time 
    • Transparency: openly sharing data regarding safe, respectful and reliable care with staff, partners and families
  • Leadership: facilitating and mentoring teamwork, improvement, respect and psychological safety

Organizational Culture

PSYCHOLOGICAL SAFETY. Team members should feel comfortable making suggestions, trying things that may not work, pointing out problems, and admitting mistakes. In order to encourage this culture, the most senior person in the room should make themselves approachable, encourage feedback, respond to suggestions and value every team member’s input. As medical students, when something doesn’t seem right, you should feel comfortable communicating this. Telling your superior or evaluators that they may be wrong is a vulnerable position to be in. We need to have a culture that makes everyone feel safe to report potential errors. 

Every new employee at Rush must complete CREW Training, borrowed from the airline industry, that stresses everyone is responsible and should feel safe calling out unsafe practices.  

ACCOUNTABILITY. A Just Culture recognizes that people make mistakes, but also that reckless behavior cannot be tolerated. We differentiate three reasons why a person may make an error; each one requires a different type of response.

  1. Human error: inadvertently doing something other than what you intended to do.
  2. At-risk behavior: intentionally doing something that increases risk, but you don’t perceive the increased risk. 
  3. Reckless behavior: consciously disregarding a visible, significant risk. 

The Just Culture algorithm analyzes errors to determine which sort of behavior was at the root of the problem.

TEAMWORK & COMMUNICATION. As part of a team, your responsibility is to communicate when you perceive things can go wrong. Everyone acts as the eyes and ears for the team to ensure that the team is working toward safety. 

NEGOTIATION. Two teams work together to find mutually agreeable solutions using appreciative inquiry (asking simple questions to gain the other’s perspective) and self-reflection (understanding your own interests and perspectives). 

Patients and families are considered part of the team. They can identify adverse events, inform clinicians and be advocates for their own health. 

Learning Systems

TRANSPARENCY. All errors should be reported. We should thank the person reporting, acknowledge the issue and then communicate change efforts. Transparency means that everyone can see these efforts. At Rush, we have an error reporting tool called TRIPPS that allows anyone who sees an error to report it; the report is routed to the appropriate teams to explore what happened. Reporting is strongly encouraged.

IMPROVEMENT AND MEASUREMENT. We need to measure the errors that we’ve reported in order to see whether the changes made to prevent them are effective. Improvement means implementing a change to make our practice better; this effort doesn’t have to stem from an error, as we’ll see in the Quality Improvement curriculum.

CONTINUOUS LEARNING. For continuous learning to occur, people need to collect data to recognize problems, work together to form solutions, try the solutions and observe the effects. This feedback gives the team insight.

RELIABILITY. A bad practice or error doesn’t always generate a bad outcome. For example, a “shortcut” around proper practice may save time, and if no ill effects follow, the shortcut is reinforced. This process is called normalization of deviance, and it erodes reliability. For people to feel comfortable avoiding shortcuts, the standard protocols need to be designed to work well in nearly all circumstances.


Leadership

Leaders play a critical role in supporting these systems. They need to encourage learning and the reporting of errors, and to apply the concepts of improvement and reliability. They create a culture that ensures psychological safety and just culture while holding people appropriately accountable.