Error vs. Failure: Taking a Different Look at What Goes Wrong in Healthcare

By Jay Kumar

When things go wrong in healthcare, we look at the wrong things after the fact, according to a presenter at the Institute for Healthcare Improvement (IHI) Forum in Orlando earlier this month.

“We have focused on human error instead of complex sociotechnical systems,” said Matt Scanlon, MD, MS, CPPS, professor of pediatrics, critical care medicine, at the Medical College of Wisconsin. “We pushed out the safety scientists…We focused on diagnostic error instead of diagnostic failure.”

Errors are never the cause of a failure, he suggested.

“We have fixated on human error for the last 20 years,” Scanlon said. “I know it’s a lot to swallow. We’ve been ‘error-this, error-that’ for a while, and it’s BS.”

The concept of error persists because it is useful, he added: it lets organizations blame an individual, such as nurse RaDonda Vaught, for what is actually a systems failure. “It creates the illusion of control,” noted Scanlon.

After an organization investigates an incident, he said, “if you’ve found the cause is human error, your investigation has failed.”

Scanlon said healthcare organizations have underestimated two critical cognitive biases:

  • Outcome bias: When the outcome is bad, reviewers judge the decisions that led to it more harshly and incorrectly assign blame, regardless of whether those decisions were reasonable at the time
  • Hindsight bias: Once the outcome is known, the events leading up to it look as though they were predictable all along; knowing about the bias does not prevent you from falling into it

Sociotechnical systems

Healthcare has ignored the implications of sociotechnical systems, said Scanlon.

“We’re not teaching this in healthcare,” he added. “A sociotechnical system is a system that has people in it…You can’t have a people problem or a system problem. The people are the system.”

Healthcare is the most complex sociotechnical system, Scanlon said.

The Systems Engineering Initiative for Patient Safety (SEIPS) model was developed by human factors engineers to study and improve healthcare. It examines how work systems shape health-related outcomes such as patient safety, and it can anchor both research and improvement efforts. Scanlon recommended the model as a framework for analyzing systems and targeting improvements.

“The outcomes are more than the sum of the parts,” he said. “We make changes constantly to our work system…We’re holding people accountable and ignoring the rest of the system.”

The PETT (People, Environments, Tools, and Tasks) Scan tool provided with SEIPS 101 gives review teams a classification system for extracting and analyzing data on barriers to, and facilitators of, patient safety.
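The PETT Scan is a paper worksheet, not software, but its coding logic is simple enough to sketch. Below is a minimal, hypothetical example in Python of how a review team might tag observations with a PETT category and a barrier-or-facilitator label, then tally where problems cluster across the work system. The four category names come from SEIPS 101; the Observation structure, the tally helper, and every excerpt are assumptions made up for illustration.

```python
from dataclasses import dataclass
from enum import Enum

class PettCategory(Enum):
    # The four PETT Scan categories from SEIPS 101
    PEOPLE = "People"
    ENVIRONMENTS = "Environments"
    TOOLS = "Tools"
    TASKS = "Tasks"

class Effect(Enum):
    BARRIER = "barrier"          # works against patient safety
    FACILITATOR = "facilitator"  # works in favor of patient safety

@dataclass
class Observation:
    """One coded excerpt from an incident review (hypothetical structure)."""
    excerpt: str
    category: PettCategory
    effect: Effect

def tally(observations: list[Observation]) -> dict[tuple[str, str], int]:
    """Count observations per (category, effect) pair so reviewers can
    see where barriers cluster across the work system, not just in people."""
    counts: dict[tuple[str, str], int] = {}
    for obs in observations:
        key = (obs.category.value, obs.effect.value)
        counts[key] = counts.get(key, 0) + 1
    return counts

# Entirely hypothetical coded excerpts, for illustration only
reviews = [
    Observation("Float nurse unfamiliar with unit workflow",
                PettCategory.PEOPLE, Effect.BARRIER),
    Observation("Pharmacist double-check caught the dose discrepancy",
                PettCategory.PEOPLE, Effect.FACILITATOR),
    Observation("Infusion pump alert thresholds set too wide",
                PettCategory.TOOLS, Effect.BARRIER),
    Observation("Handoff happened during a noisy shift change",
                PettCategory.ENVIRONMENTS, Effect.BARRIER),
]

for (category, effect), n in sorted(tally(reviews).items()):
    print(f"{category:13s}  {effect:12s}  {n}")
```

A tally like this serves the point Scanlon makes: if every coded barrier lands under People, the review is probably exhibiting the very fixation on human error he warns against.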

“Healthcare and diagnosing is messy,” Scanlon said. “Complex problems don’t have easy answers.”

There is a gap between work as imagined and work as actually performed. “Our policies and procedures are written on imagined work,” he added.

He acknowledged that he doesn’t have the answers, and that more research grounded in safety science is needed to find them. It is the interactions among a system’s parts that drive its outcomes, he noted.

“Only through honoring the complexity of our world and bringing back safety science are we going to get there,” said Scanlon. “I think we need to own that we can’t fix this even though we’d like to…we’ve pushed out the experts.”

But a good start will be to “stop saying error,” he added. Instead, “say failure.”