January / February 2005

Culture of Safety


All Systems Go? Why So Little Progress?

Computerized physician order entry: Check. Bar coding technology: Check. Electronic prescribing: Check. Web-based error reporting system: Check. Enhanced policies on disclosure: Check. Many organizations are spending enormous amounts of money on technologies that have been touted as essential to patient safety, but progress seems slow.

I work with organizations that share a strong and genuine passion for improving patient safety. They have leadership support for moving the safety agenda forward and are committing resources, but few are reporting real and sustainable progress. They have made emotional and financial commitments, but the intellectual commitment, along with the willingness to acknowledge the need for dramatic changes in the way work gets done, may still be lacking. I have come to appreciate that real progress is possible only by challenging existing assumptions and admitting that what we currently know may not be enough to get us where we need to go. Organizations and providers must fundamentally change the way they work and the systems within which work is done if they are to become successful in the pursuit of patient safety.

On November 5, we celebrated the fifth anniversary of the release of the Institute of Medicine’s (IOM’s) seminal report, To Err Is Human, which described the magnitude of the challenge of reducing medical error and promoting patient safety. The IOM presented many issues and possible solutions, and many in healthcare felt that the blueprint for creating safer systems had been established. The report spoke of the need to see safety as a systems problem, to create a non-punitive culture in which open discussions about safety are encouraged and rewarded, and to create collaborative learning opportunities through the open sharing of data grounded in a common taxonomy that describes not only the event but also the systemic factors that contributed to it. It described the need for research and focus and included examples of how other industries have moved from being highly hazardous to highly reliable. Recent articles discussing this five-year anniversary suggest that progress has been slow and quite unimpressive (Wachter, 2004; Altman, Clancy, & Blendon, 2004).

As I work with organizations eager to foster this culture of safety, I am repeatedly reminded of the complexity of the systems and structures of healthcare. It is difficult to erase the rules, traditions, and hierarchies so dominant in our culture and so often at the very root of our inability to create and support lasting and meaningful change. The activities that historically supported the organization’s quality improvement, performance management, risk management, outcomes management, compliance, and accreditation functions (and probably a host of others) are redundant at best and conflicting at worst; they now create significant overlap, with a focus on data collection rather than on safety and accountability. Professionals working in these areas are eager to participate in the patient safety movement but often do so without carefully examining their traditional roles. The redundancy of activity is often staggering. Multiple individuals may collect vast amounts of data to accommodate internal and external demands, yet that data often does little to inform the organization about the magnitude of its problems or the best approach for creating a sustainable resolution.

I have recently come to appreciate that many in healthcare think the “fixes” proposed by the IOM can simply be laid over our current systems and structures to create a culture of safety. Some institutions believe that if people just work harder and use more technology, safety will be the by-product. However, existing systems and structures must be redesigned in order to make truly measurable and sustainable progress. The manner in which data has been collected, shared (or, more often, not shared), and acted upon has limited progress and disenfranchised individuals; it, too, must be included in the redesign.

New Learning for All
Despite many years of experience in quality management, performance improvement, and risk management, I, too, needed to admit that my skills, techniques, and practices had to change. My level of expertise was no longer sufficient to help organizations move forward. I needed to learn about Six Sigma, high-reliability organizational theory, complexity theory, and industry practices not traditionally associated with healthcare if I was truly to assist in leading efforts in patient safety. I also needed to give up the notion that the only way to protect an organization from liability was to protect information from discovery. Luckily, a great number of individuals can help move our thinking forward, and where appropriate, I have included their names as references. I will start with a quote attributed to Albert Einstein that sums up my feelings about the current challenge of patient safety: “We can’t solve problems by using the same kind of thinking we used when we created them.” Let me provide a few examples of where I think we continue to struggle in patient safety because we have retained outmoded and illogical ways of thinking.

In the aftermath of the IOM report, many began touting the benefits of a blameless culture, though most continue to struggle with what the term “blame-free” really means and how to create it. Organizations often confuse the absence of blame with the absence of accountability. They struggle to create a blameless culture within their existing human resource frameworks or to align their objectives for accountability with various labor contracts and service union mandates. Policies and practices currently in place are often predicated on a blame-oriented culture, one taught to focus on the “bad apples,” and thus will not align with the objectives of a blame-free culture. Individuals must learn that when speaking of errors or near misses, they must turn their attention away from the “who” and toward the “what” and the “why.” In this area, I have found the work of David Marx to be highly instructive and helpful (Marx, 2001).

The word “transparency” has been used a great deal over the last five years but seems to mean different things to different people. To some, it means talking openly with patients or their family members when they may have been harmed by a medical error. To others, it means sharing data about errors, outcomes, and all other aspects of care, both within and outside the organization, for purposes of learning. Unfortunately, to most, transparency is seen as an invitation to expose the organization or the provider to claims of medical malpractice.

Those who advocate against transparency fail to appreciate that a secretive approach to data has not kept the plaintiffs’ attorneys at bay; more importantly, it has failed to create a culture that fosters an understanding of the factors that contribute to error and, ultimately, the creation of solutions. Although the medical malpractice system is less than desirable, we must begin to think of its failings as separate and distinct from the problems we face in patient safety. We cannot wait for meaningful tort reform (it may never occur) before elevating the candor of our discussions with our patients and with each other. I am convinced that safety will improve only when organizations and providers are willing to openly discuss their failures, near misses, active recoveries, and processes. These discussions must reach the frontline care providers, who must be actively engaged in creating the solutions that will work for them. Forcing patient safety concerns into a hierarchical committee structure delays progress and often fails to create solutions appropriate for frontline staff or to empower them to understand their roles and responsibilities in creating a safe culture.

An organization must not simply establish disclosure policies and assume that everyone will immediately know how and what to disclose, along with the appropriate means of engaging in that dialogue. We tell physicians that they need to disclose error: it is, after all, the ethical and appropriate thing to do. Yet organizations often fail to give providers the training they need to fulfill this duty successfully. John Banja, PhD, of Emory University, and Dr. Gerald Hickson, of Vanderbilt, have written extensively about both the value and the mechanics of disclosure.

Another critical area where we seem to be floundering, despite IOM guidance, is the collection and sharing of data to foster greater understanding of the systemic and contributing factors present when errors occur. Healthcare organizations have spent millions of dollars to build or buy electronic data systems. The data may now be collected in a more elegant fashion, but it still remains locked in someone’s file drawer for fear that it will expose the organization to potential liability. Our progress in patient safety is not slow because we are unaware of what needs to change; it is slow because we do not have the tools or techniques to bring about the change.

Healthcare will continue to struggle to make progress in patient safety until we are willing to change ourselves and the organizations in which we work. The silo-oriented structures in which we used to work, and the beliefs and conventions that governed how we shared what we learned throughout the organization and responded to issues of quality and safety, may have been appropriate for the past, but they do not serve us well today. Working harder will not solve the problem if the work itself does not change! Furthermore, having more data, or an electronic system to capture the data, will not improve our chances for success unless that data truly informs the frontline caregivers of their perils and opportunities and engages them as part of the solution. This can happen only if they have confidence in the organization’s commitment to a blame-free culture, if they have the data in their hands to understand the nature of the problems they face, and if they have the tools and resources to assist in creating the solutions.

References

Wachter, R. M. (2004, November 30). The end of the beginning: Patient safety five years after “To Err Is Human.” Health Affairs, Web Exclusive. Available at http://content.healthaffairs.org

Altman, D. E., Clancy, C., & Blendon, R. J. (2004). Improving patient safety — five years after the IOM report. New England Journal of Medicine, 351, 2041-2043.

Marx, D. (2001, April 17). Patient safety and the “just culture”: A primer for health care executives. New York: Trustees of Columbia University. Retrieved December 17, 2004, from www.mers-tm.net/index.html