Editor’s Notebook: The “Oops Center” and Patient Safety

March / April 2005


First, I want to say that the most important lesson of patient safety is that systems — not individuals — cause errors and injuries. An institution that attempts to improve patient safety by accusing individuals of carelessness or poor judgment, or even by simply implementing a lot of rules, causes more harm than good. On top of the damage done to those who are blamed for errors, the institution loses the opportunity to learn from close calls and adverse events. Conversely, an institution that establishes a fair and just “culture of safety” encourages clinicians and others to report and evaluate problems and to learn how to mitigate future danger. To quote Dana-Farber Cancer Institute’s Principles of a Fair and Just Culture (www.dana-farber.org/abo/news/tools/justculture.asp), “When events occur that cause harm or have the potential to cause harm,… a choice exists: to learn or to blame.”

That said, I was intrigued by a story I heard on National Public Radio last month about human reactions to committing mistakes. Under the headline, “Brain Region May Give Early Warning of Risk,” Jon Hamilton reported on a study of the anterior cingulate cortex, a part of the brain sometimes called the “oops center” for its role in anticipating and reacting to errors. Researchers from Washington University in St. Louis were looking for evidence of “cognitive control” when they asked volunteers to play a computer game that involved pressing a right- or left-pointing arrow in response to simple visual cues. The game didn’t always play fair, sometimes switching cues when there wasn’t enough time for the volunteer to respond with the correct action. The game also included subconscious signals, which allowed the researchers to see that the volunteers’ brains were quickly trained to anticipate when they were likely to make a mistake.

I don’t mean to suggest that this research has potential for preventing medical errors, but I do think that whatever we can learn about how we respond to — and, even better, anticipate — unintended outcomes leads to more effective action. If only we could program ourselves more effectively to see trouble coming! The researchers, Joshua Brown and Todd Braver, published their study, “Learned Predictions of Error Likelihood in the Anterior Cingulate Cortex,” in the February 18, 2005, issue of Science (www.sciencemag.org).

New Advisory Board Members
With this issue, we welcome three new members to PSQH’s Editorial Advisory Board. Brian Shea, PharmD, FCCP, BCPS, is a senior manager and director of the National Patient Safety Practice at Capgemini Health. Dennis Robbins, PhD, MPH, is president of IDEAS (Integrated Decisions, Ethics, Alternatives and Solutions). Richard Kremsdorf, MD, is president and CEO of CliniComp, Intl. All three are dedicated experts in the fields of safety and quality, and we are grateful for their contributions to the magazine.