Ebola: A Crash Course in Reliability


By Susan Carr


Our responses to the news that Ebola had been diagnosed in the United States for the first time reveal gaps in our understanding of how to protect others and ourselves from Ebola and other infectious diseases. When we overreact in fear and take comfort from actions that don’t actually make us safer, we may overlook aspects of our systems and institutions that really do put us at risk.

The case of Thomas Duncan, diagnosed with Ebola in Dallas on September 30, 2014, shows how unreliable our systems can be, especially under stress. The actions of Texas Health Presbyterian Hospital Dallas, where Duncan went for emergency care when he first became ill, point to broad problems with implications that reach beyond the immediate response to one patient with Ebola.

By now, the story is well known if not well understood. The hospital seemed flustered as it tried to explain why it had sent Duncan home following his first visit to the emergency department. He had presented with fever, headache, and stomach pain days after arriving in Dallas from Liberia. Poor communication and inadequate training appear to be root causes of the medical staff’s flawed response. Duncan’s family and others in Dallas are also asking whether racial or socio-economic biases contributed to delaying his treatment and, inadvertently, putting others at risk. Poor communication may also explain ineffective screening at the airport in Monrovia, Liberia. When Duncan was asked if he had had contact with Ebola, he said “no,” and we don’t know whether he lied or didn’t understand the question.

After Duncan returned to the hospital in Dallas two days later and was diagnosed with Ebola, the hospital engaged in serial finger-pointing, apparently an attempt to find someone or something to blame for having released him the first time. That was an unfortunate demonstration of the ineffectiveness of “shame and blame” tactics. Scapegoating does not inform and gets us no closer to understanding what went wrong and what we must do to prevent something similar from happening in the future. Ebola is a serious communicable disease, and we understand how to control its spread. Harm caused by dysfunctional systems in hospitals, however, remains devilishly difficult to prevent.

Writing in The Dallas Morning News, Randy Lee Loftis explains how the Duncan case demonstrates that “even sound plans can fail spectacularly.” He observes that humans are prone to error and that communication should never be taken for granted. Emergency preparations must include repeated, realistic, hands-on training. Distributing a checklist by email doesn’t count. One of Loftis’s sources, consultant Ron Ashkenas, speculates that members of Presbyterian’s emergency room staff may have been in denial about Ebola reaching the U.S., may not have taken preparations seriously, and therefore failed to recognize the threat when it arrived looking for help. Loftis quotes Ashkenas saying, “Psychologically, most of us feel that it can’t happen to us, or it won’t happen here. Africa is far away, and Ebola seems like an abstraction. So when the symptoms show up, at some level we don’t actually believe it.” Denial is, obviously, no longer an option.

Screening airline passengers entering the United States from countries in West Africa is another example of an ineffective prevention measure, one more likely to make us feel safe than to keep us safe. In fact, the false sense of security gained from actions such as temperature screening at airports only makes us more vulnerable. Reporters including Julia Belluz and Steven Hoffman, writing for Vox, remind us that Canadian researchers found airport screening had been “inefficient and ineffective” in preventing the spread of SARS in 2003.

Larry Gostin, professor of global health at Georgetown University, recently told Anders Kelto of National Public Radio about a personal experience that underlies his skepticism about the efficacy of airport screening. On board a flight to Beijing during the SARS epidemic, Gostin observed flight attendants distributing Tylenol to first-class passengers and urging them to take it so they would pass temperature screening on arrival at the airport. In that case, screening may have helped some people feel safer, but it actually increased the risk of infection.

The Ebola crisis is an opportunity to revisit the lessons of reliability. In “Managing the Unexpected: Resilient Performance in an Age of Uncertainty,” Karl Weick and Kathleen Sutcliffe describe reliable ways to stay safe in hazardous and constantly changing conditions:

  • keep track of failures, even the small ones, and learn from them;
  • resist the temptation to oversimplify;
  • pay close attention to “operations,” that is, how people on the front line are doing their work;
  • stay alert and vigilant; and
  • seek out those with the real expertise needed for the specific circumstances; that expert is not necessarily the obvious “authority” figure.

Following Weick and Sutcliffe’s recommendations would improve not only our response to Ebola but also the delivery and safety of care for all concerned.