November / December 2005
Proceedings from the Quality Colloquium
Patient Safety Officers
Roles and Responsibilities
Is it realistic to expect to reduce mishaps by 50% in five years? Yes! In 1980, after being commissioned as an aircraft accident investigator in the newly formed Israel Air Force (IAF) Flight Safety and Quality Inspection Directorate (FSQID), my assignment was to perform non-punitive mishap investigations to prevent the recurrence of aircraft accidents. From 1968 to 1978, the accident rate had remained stubbornly high, and most mishaps were repeats of similar events. Pilots were being punished, grounded, demoted, and even jailed for their errors. This practice did nothing to prevent the mishaps. There was no formal training for flight safety officers, who were rank-and-file pilots in their squadrons.
Most mishaps, as could be expected, occurred during training. In 1980, the FSQID rewrote their policies and procedures, aircraft accident investigators were sent to the United Kingdom and the United States for professional training, and a computerized coding and classification system for mishaps was designed and implemented.
This new process for safety investigations used the 5M Model (Wilf-Miron et al., 2003) to examine the interfaces among five factors: management, mission, man, machine, and medium. These investigations found that failures often occurred within the system itself. The recommendations devised were for corrective and preventive action, not punitive action. Wing safety officers received training to become part of the near-miss debriefing process, held as close in time to the occurrence as possible. From these activities and the feedback to the squadrons, reporting went up, and mishaps, over a five-year period, went down by 50%. This change occurred because the newly created FSQID's charge to conduct safety investigations within a non-punitive organizational culture was effective. Investigators were trained and empowered, policies and procedures were rewritten, and the safety culture evolved.
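As an illustration only (the article does not describe the FSQID's actual schema), a coding-and-classification system of the kind mentioned above can be sketched as a small data model that tags each mishap report with the 5M factors involved, so that repeat patterns stand out in aggregate counts:

```python
from collections import Counter
from dataclasses import dataclass, field

# The five interacting factors of the 5M Model (Wilf-Miron et al., 2003).
FACTORS = ("management", "mission", "man", "machine", "medium")

@dataclass
class MishapReport:
    """One non-punitive mishap or near-miss report (illustrative fields only)."""
    report_id: str
    narrative: str
    factors: set = field(default_factory=set)  # subset of FACTORS

    def code(self, *factors):
        """Classify the report by the 5M factors judged to be involved."""
        for f in factors:
            if f not in FACTORS:
                raise ValueError(f"unknown 5M factor: {f}")
            self.factors.add(f)

def factor_counts(reports):
    """Aggregate coded reports so systemic, repeating factors stand out."""
    counts = Counter()
    for r in reports:
        counts.update(r.factors)
    return counts

# Usage: two reports sharing a 'machine' factor surface as a repeat pattern.
r1 = MishapReport("80-001", "hydraulic warning ignored during training sortie")
r1.code("machine", "man")
r2 = MishapReport("80-002", "same hydraulic fault, different squadron")
r2.code("machine", "management")
print(factor_counts([r1, r2]).most_common(1))  # 'machine' appears twice
```

The design point is the one the article makes: coding events against a fixed set of system factors, rather than attributing them to individuals, is what lets repeated systemic failures become visible.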
The Role of the Patient Safety Officer
The role of the patient safety officer in healthcare is similar to what my role was in the IAF. Another example of performance and safety improvement, from the U.S. Navy, is the remarkable turnaround achieved by Captain D. Michael Abrashoff on the USS Benfold. Abrashoff transformed the "worst" ship in the Navy into the best. He led by example, listened aggressively, built up his crew, and improved their overall quality of life. Abrashoff's book, It's Your Ship (2002), tells the story. In his own words, part of his success came from getting the sailors aboard the USS Benfold to make an "enthusiastic commitment to our joint goal of making our ship the best in the fleet." CEOs and their patient safety officers who say, "It's your hospital," can similarly inspire and empower their own people to do remarkable things.
People will freely question conventional wisdom, come up with better ways to do things, and improve their own jobs. They will take calculated risks, trust each other, and go one step beyond their job requirements to add value. However, they need to be openly recognized and appropriately rewarded for their efforts and be able to work within a culture of trust. To bring about these much-needed changes in healthcare administration and practice, it is important to focus on the conditions that allow positive events to propagate within a culture of safety. Of equal importance is the work of identifying barriers, such as punitive safety investigations, CIOs who do not want to be part of the team, and CEOs who do not hold themselves personally responsible and accountable.
Doing It Right the First Time
No amount of motivation will make people do what is right without exception. The Institute of Medicine (IOM) report To Err Is Human (2000) asserts that the problem is not bad people in healthcare; the problem is that good people are working in systems that need to be made safer. The report offers a clear prescription for raising the level of patient safety in American healthcare. It also explains how patients themselves can influence the quality of care they receive. Involving caregivers in the decisions that affect them, determining their competencies, enabling them to do what they do best, and listening to their concerns are good first steps.
Effective leadership is never easy. However, if you take the time to do it right the first time, you won't have to deal with the cleanup. Unless people buy in to what they are doing, and do it consistently, well-designed systems and all the technology in the world cannot create an absolutely safe environment. As one administrator said, "Being a PSO requires a head of steel, a heart of gold, strong shoulders, and the ability to pass credit to others." Unless the PSO initiates and supports change, nothing will happen. Otherwise the pattern remains the same: solve a problem and set up a process, only to have someone work around it, and the cycle repeats. There is no time to continually relearn the same information.
Change: A Risky Behavior
Change is not always progress; sometimes change makes things worse. Everyone fears the unknown aspects of change. It is always easier and safer not to change. Change disrupts equilibrium or the status quo and causes turmoil. Change often increases workloads. Change is truly risky behavior. Problems that have developed over long periods of time cannot be solved in several hours or weeks. Most goals are set in terms of one to five years, but objectives and tasks are set in terms of weeks or months.
Barriers to Reporting
Barriers to the standardization of patient safety data systems include legal, regulatory, financial, technological, and political obstacles, as well as a lack of authorization, a lack of good models, and a lack of evidence of impact. Our efforts must be directed toward transforming the current culture of blame and resistance into a culture of learning about and increasing safety. A first step is to understand the balance of barriers and incentives to reporting. It is important to overcome the negatives of reporting adverse events, such as skepticism, fear of reprisals, extra work, and the perceived ineffectiveness of present reporting systems. Positive aspects of reporting that involve some degree of immunity include confidentiality and the opportunity to learn from reporters who share their stories with other healthcare professionals. The incentives to the individual, the organization, and ultimately society include accountability, transparency, and sustained trust and confidence in the healthcare system. Incentives must be tied to higher governing values. Otherwise, fears and attitudes will limit the usefulness of the structural incentives currently in place.
On July 29, 2005, President Bush signed the Patient Safety and Quality Improvement Act of 2005. This act is a direct result of Recommendation 6.1, page 10, in To Err Is Human. President Bush said, "This act will help ensure that Americans continue to benefit from the greatest medical system in the world." To maintain the highest standards of care, doctors and nurses must be able to exchange information about problems and solutions. Yet in recent years, many doctors have grown afraid to discuss their practices because they worry that the information they provide will be used against them in a lawsuit. The president stated, "This bill will help solve that problem." He said that this is a common-sense law that gives legal protections to health professionals who report their practices to patient safety organizations. By providing critical information about medical processes and procedures that are inadequate and have the potential to fail, doctors and nurses can help others learn from their experiences. These barriers to reporting will be lowered and essential information will be made more available across America, helping to ensure that patients benefit from the best medical treatment possible.
Why Report Near Misses?
Near-miss reporting is an extremely rich source of information about what works and, especially, what doesn't. When a reporting system is easy to use, people will use it, although considerable training is often required. Analysis of near misses faces fewer barriers to data collection when no injury has occurred, recovery strategies can be studied to enhance proactive interventions, and hindsight bias is largely eliminated because, with no patient harm, there are no legal or administrative recriminations. Major factors contributing to the lack of near-miss reporting are fear of disciplinary action, lack of understanding of what constitutes a near miss, lack of senior management commitment, lack of incentives to report, and disincentives or punishment for reporting. Monetary rewards given to the rank and file who report near misses bring about positive results. It is important to work in teams and to keep a sequence of steps moving in the right direction so that no one drops the ball. Unfortunately, many cynical physicians think patient safety is not a problem or that it does not apply to them.
The good news is that near-miss reporting appears to be gaining acceptance in the healthcare industry, and barriers to near-miss reporting are being recognized and addressed. The experiences of one medical director, accumulated over a 12-year period, indicate that many consumer complaints are actually about safety issues. She finds it difficult to separate quality from safety. The institutional patient safety task force she created in her hospital includes nurses, clinicians, pharmacists, and quality professionals. The hospital CEO and the medical director co-chair the task force, and multidisciplinary groups are focusing on the issues and initiating error-prevention projects. At this facility, anyone who has an issue to report can use an Internet-based prototype patient safety hotline to tell his or her story.
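At its core, a hotline like the one this hospital uses is a low-friction intake record. The sketch below is a hypothetical illustration, not that facility's actual system; every field name is an assumption. It shows the two design choices the article emphasizes: keep the form minimal so reporting is easy, and make confidentiality the default so fear of reprisal is reduced.

```python
import datetime

def submit_report(story, role="undisclosed", harm_occurred=False):
    """Accept a free-text safety story with minimal friction (illustrative).

    Only a few fields are required, which lowers the barrier to reporting;
    no name is collected, which supports the confidentiality incentive.
    """
    return {
        "received": datetime.datetime.now(datetime.timezone.utc)
                                     .isoformat(timespec="seconds"),
        "role": role,                   # e.g. nurse, clinician, pharmacist
        "near_miss": not harm_occurred, # no-harm events route to learning
        "story": story.strip(),
    }

# Usage: a pharmacist reports a look-alike drug mix-up caught before dispensing.
report = submit_report(
    "Look-alike vials stored side by side; caught at final check.",
    role="pharmacist",
)
print(report["near_miss"])  # True: no harm occurred, so this is a near miss
```

Flagging no-harm events as near misses at intake is what lets a task force like the one described route them into learning and error-prevention projects rather than into discipline.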
Some positive signs are that people recognize there are problems, and many healthcare providers are figuring out ways to solve them. The organizational culture is also changing for the better. The use of checklists and of pre-op and post-op team meetings is making a difference, and applying aviation models to communication in perioperative areas has improved patient safety.
In conclusion, PSOs must have a passion for good patient outcomes. They must like to fix things. PSOs have to persevere and be patient until others join forces with them. They must celebrate victories when they come and not lose heart when discouraging things happen. Sometimes the hospital culture is changed one disaster at a time.
For the presentation on which this article is based, several chief medical officers and risk managers were interviewed. They provided many invaluable insights. I appreciate being given the opportunity to share with readers the input of these dedicated professionals.
Douglas B. Dotan (firstname.lastname@example.org) is president and founder of CRG Medical, Inc., in Houston, Texas. He has more than 30 years of experience in quality, risk, human factors, and safety, beginning with service as a pilot and accident investigator for the Israeli Air Force. Dotan is also certified as a flight safety officer and aircraft accident investigator by the U.S. Air Force and the University of Southern California Institute of Safety and Systems Management. CRG Medical provides healthcare knowledge management and modeling methodologies and tools to assist risk managers and patient safety officers in analyzing information, data, and facts to achieve results at a low cost in human and financial resources. Dotan is program chair for the Healthcare Division of the American Society for Quality's conference, World Congress for Quality and Improvement, and is a member of the Editorial Advisory Board for Patient Safety and Quality Healthcare.
Institute of Medicine. (2000). To err is human: Building a safer health system. L. T. Kohn, J. M. Corrigan, & M. S. Donaldson (Eds.). Washington, DC: National Academy Press.
Wilf-Miron, R., Lewenhoff, I., Benyamini, Z., & Aviram, A. (2003). From aviation to medicine: Applying concepts of aviation safety to risk management in ambulatory care. Quality & Safety in Health Care, 12, 35-39.