Safety Culture: Building a Culture of Safety

In the 10-plus years since the publication of the Institute of Medicine (IOM) study on medical error, To Err Is Human, there has been surprisingly little progress in reducing the rate of medical error, despite the adoption of technologies specifically intended to combat it. A growing number of people attribute this lack of progress to fundamental flaws in American healthcare culture.

John Nance identifies elements of this culture in his book, Why Hospitals Should Fly. Nance compares the current culture of hospitals to the culture of airlines before the adoption of current safety standards. Chief among the cultural barriers to safer practice is the notion that errors are something that happens to someone else, or that only “incompetent” providers commit errors. In fact, authors such as Michael Cohen of the Institute for Safe Medication Practices, Charles Denham, MD, of the Texas Medical Institute of Technology, and Lucian Leape, MD, of the Harvard School of Public Health have all recently published findings that errors occur more often as the result of bad systems than as the result of “bad people.”

Nance argues his points by painting a picture of a fictional healthcare organization whose culture differs from those normally encountered today. Its safety culture focuses on:
  • the recognition that people are human and will make mistakes,
  • the design of systems intended to catch those mistakes before they become errors, and
  • the review, perhaps even celebration, of those “near misses” in an effort to further reduce the opportunities for error.

It is easy to see how the mythology of perfection took hold in our healthcare system. No one wants to spend the day wondering, “Which of the mistakes I make today will really hurt somebody?” Yet, that is exactly where we must arrive. We are human, and intelligent human beings have an intrinsic error rate of about 3%. The first step in establishing a culture of safety is to get people to accept that they might make an error.

The same perceptual biases that help us navigate the chaotic world in which we live predispose us to these errors:

  • Confirmation Bias—we see what we expect to see. When we inspect any work (our own, or someone else’s), our expectations govern how we perceive that work. Our brains automatically attempt to impose our expectations on what we actually see. This means that, unless we expect something to be wrong and believe that we have to find the error, we tend to see what we expect to see, even when it isn’t there.
  • Illusion of Control—humans innately maintain the illusion that steps we take can control a situation, even when the evidence indicates that we are not in control. This causes us to focus on ritual behavior, even when that behavior adds no value, or worse, leads us astray.
  • Status Quo Bias—we prefer the current situation, no matter how bad, even when something different from our perceived norm might help us perform better. This “Hamlet bias” (“… makes us rather bear those ills we have than fly to others that we know not of. Thus conscience does make cowards of us all … and enterprises of great pitch and moment with this regard their currents turn awry and lose the name of action.”) represents the tendency to fall back into old habits, even when we know they may be counterproductive.

So if, as the IOM states, “to err is human,” then how do we build a culture of safety in a complex and chaotic environment like a hospital?

First, we must recognize that building a safe culture is not something to which we pay lip service on alternate Wednesdays. A culture of safety starts at the top of an organization and must color every decision made within that organization. No matter how much rhetoric exists about the importance of safety, it means nothing if management routinely makes decisions influenced not by safety but only by throughput or productivity, for example. In one highly publicized medication error, the nurse involved was working overtime. Part of the root cause analysis for her behavior pointed to fatigue. Yet it also became apparent that the organization condoned and even encouraged such behavior by paying a bonus to the nurse who worked the most extra shifts. Data show that, after approximately 12 hours of continuous work, a human is as compromised as if they had a blood alcohol level of 0.07%. That’s nearly legally drunk.

Second, we need to acknowledge that our systems, and not our people, are the most likely cause of errors. We must learn to use our errors and our near misses to ferret out and change the system elements that place our caregivers in harm’s way. Rita Shane, PharmD, the director of pharmacy services at Cedars-Sinai Medical Center, describes a process she adopted to build upon this notion. Whenever her staff meets, she encourages them to tell stories about recent near misses so the group can wrestle with how those might be avoided. This method does not celebrate error-making, but capitalizes on the verbal traditions within healthcare to pass on knowledge of situations that need to be approached with caution.

Third, we have to recognize that no procedure or process within a healthcare system is so distant from patient care that it cannot contribute to patient safety, or the lack thereof. Do purchasing decisions promote the acquisition of product containers that are indistinguishable from each other, making proper selection difficult? Are look-alike/sound-alike items stored next to each other, making it easy to grab the wrong one? In analyzing the aftermath of a near miss or an error, participants need to keep pressing their quest for root causes until they can find no more, following the trail of cause-and-effect as deep (or as high) as it leads within the organization.

Finally, we all need to recognize and correct at-risk behavior. This may be the hardest of all. In recognizing that systems are commonly to blame for medical error, we can become seduced into the notion that there is no personal responsibility; the reality is that safe systems work only when the people within them comply with safe practice. Choosing not to use technologies such as barcode medication administration (BCMA) when they are available, or worse, using them incorrectly, is at-risk behavior.

Mark Neuenschwander, of the unSummit, notes that healthcare providers must each individually come to terms with their own fallibility. He describes a nurse’s story of her “epiphany moment,” when she recognized her own potential to make errors. She realized that had she ignored the BCMA system and administered medications she “knew” were correct (they weren’t), she might have harmed someone. When we are fortunate, that epiphany arises from a “there but for the grace of God go I” reaction to the tragic involvement of a colleague in a medical error. All too commonly, however, that epiphany comes from a “near miss,” or worse, from involvement in a medical error.

Building a culture of safety involves everything we do, all the time. It requires constant growth and questioning to promote continuous improvement at all levels of the organization.

Dennis Tribble, chief pharmacy officer of Baxa Corporation, is an expert on health-system pharmacy operations, patient safety, and related medication safety issues. A pharmacist and software engineer, he is passionate about the need for a complete restructuring of the pharmacy practice paradigm and the role technology will play in bringing about that vision. Tribble is a fellow of the American Society of Health-System Pharmacists (ASHP) Section on Pharmacy Informatics and Technology (SOPIT) and a charter member of the Pharmacy Informatics Task Force for the Healthcare Information and Management Systems Society (HIMSS). Extensively published, he serves as a reviewer on automation for the American Journal of Health-System Pharmacy. He may be contacted at Dennis.Tribble@baxa.com.

References
Institute of Medicine. (2000). To err is human: Building a safer health system. Kohn, L. T., Corrigan, J. M., & Donaldson, M. S. (Eds.). Washington, DC: National Academy Press.

Nance, J. (2008). Why hospitals should fly. Bozeman, MT: Second River Healthcare Press.