Why Worry About Near Misses?

September / October 2007

Imagine you’re a nurse on the Code Team, rushing to respond to a patient who has just gone into cardiac arrest. As you enter the room and begin prepping the patient for defibrillation, you notice the yellow plastic bracelet on the patient’s wrist that indicates “Do not resuscitate.” You and the other members of the Code Team stand down. You start packing up the unused defibrillator as the patient is allowed to pass away. The floor nurse who had been caring for the patient — the one who called the code — is standing at the edge of the room wearing a pained expression. Finally she speaks up and says, “I don’t think this patient is DNR.” But what about the yellow wristband? “I think she’s a full code.” An immediate review of the chart confirms the patient’s full code status, and you again — now having lost critical time — set about rescuing the patient. Thankfully, you’re able to bring the patient back. You hope the neurological exam will rule out an anoxic injury, but in any case you’re relieved to have escaped a tragic error. Now, how did a patient who was a full code get a DNR wristband?

This is essentially the scenario, portrayed with some dramatic license, that played out in a Pennsylvania hospital in 2005, in which a patient was mistakenly designated DNR and later arrested (PA-PSRS, 2005). The Code Team nearly withheld resuscitation until the mistake was recognized. This error, like most, had several causes. The proximate error was that a nurse put the wrong wristband on the patient. At another facility where this nurse also worked, yellow wristbands were used to identify patients at risk for falling; at the facility where the event occurred, yellow wristbands designated patients with DNR orders. A contributing factor was that the wristbands included no text to indicate their meaning. The systems-level issue that created the conditions for this error was the lack of standardization across facilities in the meanings of these color-coded wristbands.

If you were the patient safety officer or risk manager and this event happened in your hospital, wouldn’t you want to know about it? On a gut level the answer is obvious: of course you would. Yet many hospitals have not committed to comprehensive, systematic collection of near-miss reports throughout their facilities as part of their patient safety or risk management programs. Similarly, most state programs that collect patient safety data are standardizing on the National Quality Forum’s list of Serious Reportable Events in Healthcare, also known as the “Never Events” (NQF). This overlooks one of the key recommendations of the Institute of Medicine’s 1999 report To Err Is Human, which advocated studying near misses “to detect system weaknesses before the occurrence of serious harm” (IOM, 2000).

If your hospital collects only reports of adverse events and ignores near misses, you are missing out on the most valuable source of data for identifying patient safety priorities and for measuring your progress on the problems you’re trying to fix.

A Source of Actionable Information
As a result of the near-miss wristband error described at the beginning of this article, the Pennsylvania Patient Safety Authority issued a statewide advisory alerting the healthcare community to the potential for error and providing guidance on reducing the risks of this method of communicating important clinical information. In response, hospitals in Pennsylvania voluntarily formed the Color of Safety Task Force to develop a standardized approach based on the Patient Safety Authority’s guidance (PA-PSRS, 2006). The task force’s Implementation Manual was published on the Patient Safety Authority’s website and has since become a model that is being adopted or adapted in other states, including Arizona, California, Colorado, Nevada, New Mexico, and Utah.

In September 2006, national news organizations reported on six infants who received heparin overdoses — several of them fatal — in the neonatal ICU at an Indiana hospital. Heart-wrenching images of the babies’ grieving parents were broadcast repeatedly, and hospital officials were quick to accept complete responsibility for the error. At times like this, human nature cries out for someone to blame, and it is easy, and in some sense satisfying, for the public to point the finger at the nurse who administered the overdose or the pharmacy technician who stocked an adult concentration of heparin in the neonatal ICU drug cabinet. But the talk among those of us in the patient safety community included a great deal of empathy for the people involved, because we understood that the error could have happened in any hospital in the United States and that anyone could have made the same mistake.

As safety experts, we can look at this tragic event from the outside and hypothesize about some of the systems factors that may have been at work. Nurses in the neonatal ICU handle dangerous drugs around fragile patients every day. Was there a double-check on this high-alert medication before the dose was administered? If so, did the double-check include looking at the medication vial? What is the culture around double-checks in this particular ICU: a rigidly followed rule, or an informal and sometimes optional practice? What about the process for restocking the drug cabinet? Was the adult heparin chosen incorrectly by the pharmacy technician when refilling the cabinet, or had the wrong concentration been selected earlier, in the pharmacy? If it was in the pharmacy, how and where are the adult and pediatric concentrations stored?

While no one looking at this tragedy from the outside can say with any assurance whether this hospital had an obvious opportunity to prevent the error, it doesn’t take much effort to imagine the types of near-miss reports in the months preceding the event that might have made a difference. For example, suppose adult heparin had been stocked in a pediatric unit before, but a nurse recognized the error when retrieving the vial, prompting a review to determine how the vial had been stocked in error. Suppose a pharmacy technician, when stocking a cart for the neonatal ICU, had reached for the adult heparin but noticed the error immediately, and this got him thinking about how easy it would be to make that mistake again and how he might prevent it. Suppose the director of pharmacy saw a pattern of medication errors in this unit suggesting that double-checks were being skipped; while no harm had yet occurred as a result of those errors, the same errors with more dangerous drugs might cause serious injury or death.

In the reporting systems ECRI Institute participates in, we often see reports that involve little or no harm to the patient but in which the potential for much more serious injury is clear. For example, in the Pennsylvania Patient Safety Reporting System (PA-PSRS), which we operate under contract to the Patient Safety Authority, we have published guidance based primarily on reports of near misses, such as:

  • Sandbags used to maintain pressure on wounds, not recognized as containing metal pellets, being pulled into the bore of an MRI magnet.
  • Non-radiopaque sponges being introduced into the OR when interventional radiology procedures are performed there.
  • Clear liquids, such as Domeboro solution, kept in unlabeled containers and mistaken for drinking water or sterile water.

Many of the Hazard Reports issued by ECRI Institute’s Health Devices program are also based on reports of near misses or unsafe conditions, including the following recent examples:

  • During regular inspection of its laparoscopic insufflators, an ECRI Institute member hospital discovered defective carbon dioxide cylinder yokes that would have allowed cylinders of inappropriate gases to be connected to the insufflators.
  • Another hospital reported that three times over the course of ten months, an x-ray tube switch became stuck in the engaged position, so the tube did not stop moving. While no injuries occurred, the moving tube could have struck and injured someone.
  • Many hospitals have reported that their staff have used manufacturer-provided suitcase-style cases to store endoscopes both before and after reprocessing. Because these cases and their foam packing cannot be disinfected or sterilized, storing a dirty endoscope in a case even once can contaminate any reprocessed scope later placed in it.

These are just the types of events that provide an opportunity to take action before someone is injured. And the Patient Safety and Quality Improvement Act of 2005, which establishes confidentiality and legal protections for information shared with Patient Safety Organizations, will make it even easier for healthcare organizations to share this type of information with one another.

Monitoring Processes and Behaviors — Not Just Outcomes
Every parent of a teenager learning to drive knows the importance of seatbelts. You want your son or daughter wearing a seatbelt whenever they’re behind the wheel. While seatbelts can’t prevent accidents, they certainly improve the odds of survival and lessen the injuries that are sustained.

Do you monitor your child’s use of seatbelts any time they’re in the car, or do you note their seatbelt use only during automobile accidents? Your hospital should be collecting near-miss reports for the same reason.

Those of us who work in safety aren’t concerned only about outcomes. We’re also concerned about processes and behaviors: correcting those likely to lead to bad outcomes and reinforcing those likely to lead to positive outcomes.

Imagine being the patient safety officer at a hospital that has just encountered its first wrong-site surgery. You’ll no doubt be asking yourself, “How could this have happened here?” If you don’t collect reports of near misses in your hospital, you may start from the premise that this is an anomaly. You’ll look back over years’ worth of adverse event reports and find nothing that could have warned you this was coming.

Instead, if you encouraged reports of near misses and unsafe conditions in your facility, you might see dozens of relevant reports that make it clear how a wrong-site surgery could happen in your OR. You would likely see reports of:

  • Patients being brought to the OR without proper identification.
  • Discrepancies between the OR schedule and the consent form.
  • Site markings being done without the patient’s or surgeon’s involvement.
  • Regional blocks being performed on the wrong side.
  • Surgery being performed without the original diagnostic test results or films available.
  • Time outs being performed without all team members present, while the surgeon is in the next OR finishing another case.
  • Site markings obscured by surgical drapes during the time out.

Some of these things are more clearly near misses than others, but all of them would be useful warning signs. A discrepancy between the schedule and the consent that is identified during the time out and rectified before the procedure doesn’t reach the patient and therefore can be judged to cause no harm. The regional block performed on the wrong side is more equivocal. This is an error that reaches the patient, but has it caused harm?

Adverse Events Are in the Eye of the Beholder
It isn’t always obvious to clinicians when an event causes harm to a patient in a way that should be reported to a patient safety program, whether internally to the hospital or externally. Hospitals need to make clear to all employees and medical staff what types of events should be reported, both for the hospital’s internal quality improvement purposes and so the hospital can meet the requirements of the mandatory or voluntary reporting programs it participates in.

If a patient falls from bed, strikes his or her head, and develops a subdural hematoma, everyone recognizes that as an iatrogenic injury. Most clinicians (but not all) would probably regard a laceration requiring stitches the same way. It becomes less clear as the consequences diminish, as with a laceration requiring only steri-strips or bandaging. If a patient is given food to which he or she is allergic and suffers an anaphylactic reaction that is quickly recognized and treated with epinephrine, is that an adverse event, or an adverse event narrowly averted? In a case reported to ECRI Institute’s Problem Reporting System, air was inadvertently injected into a patient using a contrast injector, which could have caused a fatal air embolism. The patient was treated with hyperbaric therapy and reportedly did not suffer any long-term injury.

But patient safety isn’t just about keeping patients safe from long-term or permanent injury. It’s about keeping them free from unnecessary risk, and events like those described above should preoccupy every patient safety officer, regardless of the level of harm the patient suffered this time.

It is often difficult to distinguish bad outcomes that follow proper treatment from iatrogenic injuries (or potential injuries), in which the care itself causes the harm.

If a patient suffers a perforated colon after a colonoscopy, the gastroenterologist who performs several cases each week will likely ask whether there was anything they could have done differently to prevent that perforation. If the answer to that question is no, they may consider it a known complication of the procedure and one that is no doubt on the consent form. While it is clearly not the outcome one wants, putting the label of “adverse event” or “iatrogenic injury” on it may seem to imply culpability on the part of the physician. This rationale can be applied to many complications, such as lacerations during C-section, post-op bleeding, or reactions to anesthesia.

However, many organizations have made remarkable strides in the past several years in reducing the incidence of chronic problems, such as pressure ulcers or healthcare-associated infections, that are too often seen as a “cost of doing business.” We are quickly expanding the scope of what is preventable, and as we do so, we start to see what were considered “complications” as patient safety issues. If you collect both adverse events and near misses, you don’t have to rely on the front-line clinician to make the judgment for you about what is worth monitoring.

Overcoming Objections
Hospitals that haven’t moved to near-miss reporting may raise the following objections as reasons to stay focused on the adverse events and potential claims that have traditionally been the focus of risk management:

“They won’t report near misses.” You may have to see it to believe it, but you can get clinicians to report their near misses. Fewer than 5% of the reports submitted to PA-PSRS involve significant harm to patients. A similar proportion of the medical device problems reported to ECRI Institute’s Problem Reporting Network involve injuries to patients. The strategy for getting people to report their near misses is the same one you use to get them to report adverse events. You have to:

  • Explain the scope of events you want them to report and why it’s important.
  • Continuously show them the changes occurring on the basis of the reports they submit.
  • Always reward them and never penalize them for reporting.

“They won’t report all the near misses.” That’s right, they won’t. Clinicians don’t report all the adverse events that happen either, so it would be foolish to expect them to report every near miss. Fortunately, you don’t need every one to get actionable information from the reports you do receive. You only need enough reports to recognize that you have a problem.

“We don’t have the time or resources to deal with everything.” Every activity, in any area of life, is constrained by limited time and resources, which forces us to prioritize the goals we pursue and the means we use to achieve them. You will never be able to tackle every safety issue you identify. But if you aren’t collecting as much information as you can about the threats to safety throughout your facility, you are leaving the decision about what to prioritize to chance rather than making it yourself.


William Marella is director of patient safety reporting programs for ECRI Institute. He helped develop and manages the Pennsylvania Patient Safety Reporting System (PA-PSRS) under contract to the Pennsylvania Patient Safety Authority. ECRI Institute has developed and operated patient safety reporting systems on behalf of government agencies, healthcare organizations, and the clinical community. Marella may be contacted at WMarella@ECRI.org.

References

Institute of Medicine (IOM). (2000). To err is human: Building a safer health system. L. Kohn, J. Corrigan, & M. Donaldson (Eds.). Washington, DC: National Academy Press, 8.

National Quality Forum (NQF). Serious reportable events in healthcare: 2005-2006 update. NQF website. Available at: http://www.qualityforum.org/projects/completed/sre/

Pennsylvania Patient Safety Reporting System (PA-PSRS). (2005, December 14). Color-coded patient wristbands create unnecessary risk, 1-4.

Pennsylvania Patient Safety Reporting System (PA-PSRS). (2006, August 9). Update on use of color-coded patient wristbands, 3(S1), 1-4.