July / August 2009
Medication Safety Technologies:
What Is and Is Not Working
Almost ten years ago, the Institute of Medicine report, To Err Is Human (2000), galvanized healthcare, patients, Congress, and the media to pay attention to the problems of patient and medication safety. In the years that followed, hospitals made enormous investments to improve practice and implement safety technologies. How far have we come? Where are we now? Is there any way to anticipate what happens next?
To help hospital leaders and clinicians answer critical questions about safety, on September 26, 2008, Drs. Bates and Wachter, two of the world’s leading experts on medication safety, shared their perspectives in a nationwide, interactive audio conference that drew listeners from more than 1,100 sites across the United States. The 90-minute program and Q&A were made possible by an unrestricted educational grant from the Center for Safety and Clinical Excellence. Tim Vanderveen, PharmD, vice president of the Center, posed key questions and served as moderator. The following interview is taken from that presentation.
Dr. Vanderveen: Ten years after the Institute of Medicine (IOM) report, To Err Is Human, do quality and especially safety still have traction within our healthcare system?
Dr. Wachter: I think what we’re seeing is a revolution. There are no signs of anything but continued acceleration and pressure to improve quality and safety. Prior to 1999 there was no incentive to work to improve safety other than professionalism and ethics. Those are powerful incentives, but not powerful enough to change the system the way it needed to be changed.
In the last nine years, we’ve seen tremendous pressure coming from many fronts, whether it’s accreditation, information technology, the regulatory environment, or transparency. It’s working little by little, but it’s harder than I think people thought it would be. We’re seeing some missteps, overreaches, and unanticipated consequences. But there’s no question in my mind that the safety and quality movements are for real, and that the pressure to get it right is only going to accelerate. To me that’s a good thing.
Dr. Vanderveen: The IOM report recommended several threshold improvements, including a 50% reduction in medical errors in five years. Almost ten years later, what is your report card on meeting the IOM goals?
Dr. Bates: With respect to the 50% reduction, the truth is that we don’t really know, because we don’t have good metrics for sorting out how common medical errors are in most institutions. We do know vastly more about the epidemiology of safety than we did, and hospitals are starting to take the steps that it will take to substantially improve safety.
Safety is a lot higher on the priority list of hospital leaders and of boards than it was eight or ten years ago. A national study done around that time showed that safety was something like 57th on the priority list of a large group of hospital CEOs. Now when our CEO talks with the key supporters of our hospital and our board, safety is right at the top of the list. We’re making progress; we just haven’t really figured out exactly how to measure that progress.
Dr. Vanderveen: Your study from 1995 showed that prescribing was associated with 39% of medication errors; transcribing, 12%; dispensing, 11%; and administration, 38%. Typically only 2% of errors in the administration phase would be detected. Do you think these statistics have changed?
Dr. Bates: In hospitals that have implemented computerized prescriber order entry (CPOE), I would suspect that the proportion of errors related to prescribing would be smaller and certainly the percentage related to transcribing would be minimal. The same is true for technologies that improve the safety of dispensing and administration. But we don’t have many recent, comparable studies in hospitals that have these technologies in place. Doing some studies like that would be very interesting.
Dr. Vanderveen: Is it possible to change the system as it relates to the medication use process? If so, who will pay for it?
Dr. Bates: I definitely think it’s possible to change the system. When we started our studies, technologies such as CPOE, barcoding, smart pumps and monitoring for adverse drug events either did not exist or were used in only a few places. All those technologies can dramatically change the game in this area. Another frontier is medication reconciliation, bringing in data from outside the hospital to the point of care so that it is available to providers.
*This interview occurred before the passage of the Stimulus Package, which provides substantial federal support for IT implementation.
Dr. Wachter: The issue of measurement is incredibly important. David has been the world’s leader in trying to define the number of medication errors and their patterns. As he notes, it is awfully hard to measure them.
The difference between quality and safety is often subtle, but I think this is one area where the difference is profound. By looking at administrative data or reviewing charts, I can easily determine whether a patient with a heart attack was given aspirin or a beta blocker, or what their door-to-balloon time was for an angioplasty. But from that kind of review I really have no good way to figure out whether there was a medication error, a handoff error, or virtually any kind of error.
Strategies such as transparency and putting performance information on the Web work surprisingly well in quality, because the data can be collected in a fairly standardized way and posted for everybody to see. That seems to generate change.
Those strategies don’t work very well in safety, because we don’t have the same ease of measurement. For example, if at one hospital the number of incident reports in the last year has gone up by 20%, what do they say? “Terrific. This shows we’ve generated this culture of safety, the reporting system is working, so we’re getting safer.” At another hospital incident reports have gone down 20%. What do they say? “We’re getting safer; there are fewer incidents being reported.” We don’t know which is true. This measurement problem, which might seem a little bit arcane and wonkish, turns out to be extraordinarily important as you search for policy interventions to make safety improvement work.
In terms of the dominant tensions in the field, I think we’re in an incredibly interesting phase. After the initial IOM report, the mantra was mostly “No blame.” We now know that most errors are committed by good, competent people trying to do the right thing. The right strategy is not to blame them, because that pushes the errors underground and we don’t learn from them. But the mantra now is “accountability,” meaning that it’s somebody’s fault if we don’t fix this. How do you reconcile “no blame” with “accountability”?
I think the answer is that “no blame” remains the appropriate strategy for frontline providers who slip and make mistakes because they’re human. But it’s very clear that the public has absolutely no appetite for “no blame” being the dominant philosophy when they look at a hospital and figure out that there are too many medication errors, people aren’t washing their hands, or you name the safety hazard. The dominant model that we’re going to be operating under is one of intense accountability, with more public reporting, more transparency, and more accreditation pressure.
Who’s going to pay to do all of this? I believe David is completely right. It was a pipe dream that somebody would swoop in with a big pot of cash and say, “We understand your plight and here’s extra money for you to computerize, do safety training, or buy simulators.” That’s not going to happen.*
The hospitals will pay for this when it becomes in their interest to do so. All the hesitations about computerization seem to melt away when it becomes more expensive not to be computerized than to be computerized. How does that happen? Pay for performance might get us part of the way there. But suppose you’re under intense regulatory pressure and are reporting all your errors to the state; if you have a major medication error, the state will come in. To fix it, a hospital may have to hire 50 full-time equivalents (FTEs) to scrutinize every doctor’s order and be sure that nobody used an unapproved abbreviation. At some point it becomes cheaper to have CPOE.
That’s the environment that we find ourselves in. If you look at all of the pressures on hospitals, many are designed to shift the business case, to make it in their financial interest to focus on safety in new ways, whether that’s training, staffing, or computerization.
Dr. Vanderveen: Dr. Wachter, recently you wrote that your priorities on implementing patient safety technologies had changed. Can you share some of the thinking that led to your commentary?
Dr. Wachter: I’ve been thinking a lot about this over the past couple of years. I consider myself an amateur in this, and David is a pro. But here is what I’ve seen.
Until the last year or two, the recommendation I heard was that if you were going to stage your implementation of computerized technologies, CPOE should come first. It delivers the biggest bang for the buck, it’s the master technology, and others should follow. That made a lot of sense. But I think part of the reason CPOE was positioned first was that most researchers in safety generally, and in medication safety in particular, have been physicians. To them prescribing errors seem the most palpable. A doctor’s handwriting is sort of iconic as a cause of safety problems. So it was natural to focus on the prescribing process as the key part of the problem. Obviously, it is extraordinarily complex and error prone. Not only are there issues in prescribing itself, but it becomes the scaffolding for computerized decision support—which may ultimately provide the biggest bang for the buck of all.
Second, I began to realize that much of David’s work was done in the Brigham and Women’s Hospital, which has been working on computerization for almost a generation. The question was, would their results translate to the rest of us who are going to buy vendor systems, put them in, and turn them on? To my mind, that remains an open question, although I think David will help educate all of us.
Third, I think that developments other than CPOE may have made the prescribing process better and safer. Many physicians can easily look up drug interactions, doses, etc. on their PDA or smartphone. Also, to the extent that people are following the National Patient Safety Goals such as eliminating certain high-risk abbreviations, prescribing has gotten safer, independent of computerization.
Finally, if you think about the medication management process, there are many opportunities for an upstream error such as in writing a prescription to be caught downstream, perhaps by a pharmacist or by a nurse at the point of care. It seems that a disproportionate number of the most critical errors are administration errors, e.g., giving a patient the wrong pill or an incorrect intravenous (IV) dose through a pump. I think it’s partly because there’s no one to catch those errors right at the end of the medication use process. I also began to think about the extraordinary expense and complexity of implementing CPOE. Solutions such as smart pumps, barcodes, etc., are probably easier to bite off than transforming the entire prescribing process.
Thinking about all of that—this is completely non-evidence based, just one person’s musing—I might start with one of the point-of-care solutions, either barcoding or smart pumps, and work out from there. I don’t know whether that’s right.
At the end of the day, of course, the whole medication management process has to be computerized and all the pieces linked together in a seamless way. Ultimately we all have to have CPOE, barcodes, and smart pumps woven together, with terrific, embedded decision support. It’s not a question of whether you should implement one technology and stop. It really is a question of how you get started.
Dr. Vanderveen: Dr. Bates, as someone who has been at the forefront of basically all the medication safety technology and adoption, what are your thoughts on setting priorities?
Dr. Bates: I disagree with Bob a bit, although there is much we agree about, too. The older epidemiological studies of harm are the principal reason that I disagree. Those show that most of the harm that comes to patients comes from prescribing errors of various types.
I really don’t think that prescribing has gotten that much safer. We did a study in six community hospitals in the Boston area and found that the frequency of adverse drug events was even higher than in the older studies and that many more of the adverse drug events were preventable. Especially important issues were using drugs in patients with renal insufficiency and not considering laboratory tests when prescribing to patients. The best evidence is for CPOE, especially compared to barcoding and smart pumps, where the evidence is much less strong.
I do agree that we still don’t have good evidence about the impact of CPOE using vendor applications in settings such as community hospitals. From academic hospitals, a number of studies show that CPOE does a very good job at reducing medication error rates. The evidence about preventable adverse drug events is still equivocal, because large enough studies have not been done.
That being said, if I had a finite amount of money, the order that I would do things in would be CPOE, then barcoding or smart pumps. But as Bob noted, in terms of ease of implementation the order is quite different. Smart pumps are by far the easiest of these technologies to implement. A number of institutions were able to install them across the whole institution over a 12- to 24-hour period. It took us a year or two to put in barcoding, but the biggest challenge was to generate the capital to do it. CPOE is by far the hardest to implement.
At the individual hospital level, I think that the choice of which technology to do first depends a lot on your information systems, the assets you have available, and the cohesiveness of your medical staff. At the end of the day, you want to do all of these. That’s really the important message.
It is going to be hard for hospitals to gather the necessary resources. The price tag for each of these things is not inconsequential, and they really should be linked. That has not been done at many places today.
Dr. Vanderveen: What are your thoughts on the relative value, either proven or perceived, of safety measures such as CPOE, pharmacy review, and drug/drug interaction software that are implemented further from administration versus the “sharp end” safety measures such as positive patient identification, barcode medication administration, and smart pumps?
How important is the need for evidence in adopting new technologies that often seem intuitive? For example, scanning a patient’s barcode wrist band and matching the patient to the medication, or providing an alert that the infusion pump is programmed to give a tenfold overdose and preventing that.
Dr. Bates: For CPOE the evidence is now reasonably robust, although there are certainly many additional questions that need to be answered. How big is the benefit? What happens using various vendor systems? What are the most important types of decision support to include? To get the benefit that you want with CPOE, you have to have a good bit of decision support. I’ve worked with Leapfrog and First Consulting Group to help develop an evaluation for CPOE applications that will help hospitals assess their decision support (Metzger et al., 2008). Many hospitals don’t have a good sense of how much they have in place when they implement CPOE. It’s tricky, because if you put in too much at the beginning, that can affect adoption. The expert advice is to start with relatively little decision support and then ramp it up. Pharmacy review has been around for a while, and the existing evidence is reasonably compelling. It makes sense to just continue doing that.
Overall, I think the importance of drug/drug interactions is quite a bit overrated, in that hospitals will want to show only the ones that are most important. Showing too many can really create problems. It creates a lot of extra work for the pharmacy, and with CPOE it can even cause failed adoptions; there have been a number of those around the country, because it is possible to show too many alerts. That has a couple of downsides. People find the alerts annoying, and excessive alerting may make them even more likely to overlook the truly important alerts.
The sharp end measures are also extremely important, in part for the reasons that we talked about earlier. If you make an error at the sharp end, it is going to make it through to the patient. Smart pumps are especially important, because many of the agents that are given by smart pump are especially likely to cause problems. If you give somebody a tenfold overdose of many of the drugs used intravenously, it can be very bad. I think that many of those errors just were not detected in the past and were written off to some other problem that the patient was having.
I do think it’s important to have evidence, even around adopting new technologies that seem intuitive. Hospitals have very scarce resources and are desperately trying to prioritize these various technologies—and not just the ones that we’ve talked about. Hospitals are also thinking about buying a new MRI scanner, a new PET scanner, etc. Having evidence to justify the benefits of medication safety technologies is going to be very helpful in getting hospitals to actually adopt them.
Dr. Vanderveen: Dr. Wachter, you recently wrote about the “hype cycle” of emerging technologies. Using CPOE as an example, can you take us through the various stages, starting from the “Technology Trigger” and ending up at what is called the “Plateau of Productivity”?
Dr. Wachter: The “Technology Hype Cycle” (Figure 1) was popularized by the consulting company, Gartner, a few years ago. CPOE is a pretty interesting example, in that it is following along that curve. The technology was developed in the late ’60s or early ’70s and began to get hyped fairly quickly, in part because it’s a compelling story. It makes a lot of sense to people and it should work. The early research on it by David and others was extraordinarily positive and led people to feel that this technology is absolutely essential and works quite well. The endorsement of CPOE in 2002 by the Leapfrog Group as one of three safety standards may have been the height of that cycle, when it was being promoted by a national organization as a “must have.”
Figure 1. The Technology Hype Cycle (Gartner, Inc.)
New technologies tend to follow a predictable path. Tremendous initial excitement surrounds the launch, but few technologies live up to this hype, meaning that the peak is inevitably followed by a trough. But if the technology is any good, some hardy individuals and organizations will say, “This is probably not as good as it looked in phase two (the peak of the hype), but it’s not as bad as it looks in phase three (the trough of disappointment). Let’s keep working on it, let’s refine it, let’s improve the technology, let’s do research to demonstrate its value.” Technologies that make it through the cycle end up in a realistic, useful place where they get adopted, because they actually are making the world a better place. This process has already been seen with CPOE, and a similar progression will be seen with barcoding, smart pumps and other new technologies. Knowing the cycle can enable you to anticipate the next phase.
Over the last four or five years as we look at the literature, CPOE is going through a bit of the trough. The much-critiqued study out of Pittsburgh Children’s Hospital that showed an increase in mortality rate after the adoption of CPOE was part of that. Literature in the last several years about unintended consequences, workarounds and the experiences of organizations that have had difficult or failed implementations have all led to some disillusionment with the technology. And the technology itself needed to improve.
Frequently as part of this cycle, the initial technology only gets better after people work on it and get feedback from users. I think we’re now entering these last two phases where people are more realistic about how challenging it is to implement CPOE. The systems are getting better, hospitals are having successful implementations, beginning to talk about that, and the research is getting better. So I think we’re on our way toward the Plateau of Productivity. As you look at the implementation numbers, you see they’re beginning to go up more and more with the quality of the systems. You’re starting to see some consolidation in the industry, and the winners are developing better systems, far more user-friendly than ones introduced even as recently as five years ago.
What is interesting to me about the cycle is that you can play it out with almost any technology. I think we’ll see similar things with barcoding, smart pumps, and other new technologies that are coming down the pike. It’s worth knowing the cycle because you can anticipate the next phase.
Dr. Vanderveen: What are some of the lessons learned about the successful implementation of technologies such as smart pumps, which add new capabilities to replace pumps that have been in use for over 40 years, and of others such as BCMA and CPOE that are completely new?
Dr. Wachter: The lessons often relate to the challenges of the interface between human behavior, human frailty, and technology. One of the key lessons is that everything is harder than it appears to be when a technology is being used by the pioneers. Early studies are done at selected places by people who are very passionate about the technologies, where everybody is on board. Those results often outstrip what you see when a technology gets rolled out nationwide.
We’ve learned that initial budgeting for implementing technologies such as CPOE was inadequate. People didn’t really understand the importance of having backup systems and decision support and of educating everybody. The budget has to include not just the people on staff today but all the new nurses, travelers, and possibly residents every year. I think people have gotten much more realistic about budgeting for the costs of these sorts of things and for decision support.
People have also become much more realistic about anticipating “unanticipated consequences” that are completely predictable. There is recognition now that there is an engineering aspect to technology implementation and a socio-cultural aspect. A new technology or a new process can look really good when you’re sitting in a conference room or in your laboratory developing it. But it is absolutely critical to go to the workplace to see what happens when users try to implement it. More times than not, you see workarounds and unintended consequences. You have to be open to what you can learn from monitoring user behavior in adopting the technology.
Dr. Bates: I would largely agree. I hope with respect to CPOE that we are through the Trough and heading up towards the Plateau. I think the key thing that came out of the Han study from Pittsburgh (2005) was that if you implement badly, it can have serious consequences. There are lots of questions about that study, but the mortality rate amongst kids who were transferred in actually went up after implementation of CPOE (Han, 2005). Subsequently, another site used exactly the same commercial application, did the implementation better, and the mortality rate went down (Del Beccaro, 2006). The decrease was not statistically significant but was certainly clinically significant.
Some of the messages that I take away from the unintended consequences literature are that it is absolutely essential to measure the new problems that you create and have a systematic approach to going through and fixing those in an ongoing way. For example, at one time or another we had virtually all of the issues with CPOE at the Brigham that Ross Koppel noted at the University of Pennsylvania (2005), but we’d gone through and we fixed them.
Looking back, I have to say that we didn’t devote enough resources at the beginning to do that sort of thing. Our thought was that we would be able to take the technology, put it in, and then be done and move on to other things. With CPOE, anyway, that’s not the way that it works. With some of these other technologies, you don’t have to do quite as much on an ongoing basis, but you have to maintain all of them.
Dr. Vanderveen: A new technology also brings with it new sources of information. Dr. Bates, you’ve referred to the smart pump logs as a “treasure trove” of information about the use of IV medications. What are some of your “ah, ha” moments when you look back at what you’ve learned from these technologies?
Dr. Bates: With both smart pumps and barcoding, you can generate a database, which is then unbelievably helpful in terms of diagnosing problems with your process and then making it better. For example, when we looked at the smart pump data, we found an instance in which a nurse was trying to give too high a dose of amiodarone. She tried nine different times to get around this warning. Eventually she just entered the drug as a basic infusion to circumvent the decision support and give the patient too much. That was very helpful to me in terms of understanding what mental processes were going on. With the smart pumps, we found many examples of bolus dosages administered from continuous infusions, often without setting a volume, that we never would have known about otherwise. We had no clue that we had as big a problem with bolus dosages as was actually present. We also found that many of the IV drugs that were being infused in our units actually had no orders at all. That was a kind of shocking thing to me. So these technologies can be really helpful.
On the barcoding front, we recognize that there are many questions we want to answer about why providers are or are not using the barcoding in the way that we’ve wanted them to. We’ve not made it as easy as we should have to get that information out of the database. So it really does pay to look at these things. You get the answers to all kinds of questions that you didn’t even know you had.
Dr. Vanderveen: You led an effort to create the first standardized drug library for infusion pumps and then looked at the drug libraries in 100 hospitals. What did you find with regard to the standardization of concentrations, dosing units, etc?
Dr. Bates: The lack of standardization around those parameters was really unbelievable. There is enormous variability from place to place, and the result is a real safety risk. It was very common for hospitals to represent drugs with seven different types of dosing units and multiple concentrations. The net result is that if you’re using “ten” of something, that might be ten times too much, just the right amount, or ten times too little. It’s not surprising that providers get confused, especially when they’re moving from place to place or traveling to the extent that happens today with nurses. So this whole issue of standardization represents an extremely important opportunity. Pharmacists and nurses undoubtedly had more insight into this, but it was certainly a big surprise to me how big an issue this was.
Dr. Vanderveen: No matter what medication technology we discuss, there are two fairly consistent findings. First, some clinicians decide not to use the technology or to turn off some of the capabilities. Second, safety and decision support alerts may be disregarded or ignored. Workarounds such as laminated sheets with barcode labels and extra wrist bands are used with barcode systems. Smart pump alerts are frequently overridden when the current practice is not aligned with best practice. What can hospitals do to improve compliance and ensure that these technologies, when implemented to prevent errors, are used to their fullest?
Dr. Bates: It is exceptionally important to pay attention to human factors and make it easy for providers to do the right thing. It’s clear, for example, that if important alerts don’t look different from unimportant alerts, people will routinely ignore some of the important ones. You also have to track how often the warnings are going off and educate providers about the importance of the alerts. When we started working with the smart pumps, we found that nurses weren’t really convinced that the warnings we were giving them were important. We were studying this in units in which the nurses had a lot of experience and really did think that they knew better. So education can be useful. Some individuals override alerts on a regular basis. If you track who is overriding a lot and then have a conversation with them periodically, that can go a long way toward improving compliance.
Dr. Vanderveen: Dr. Wachter, how should an organization plan to prevent unintended consequences of implementing safety measures? For example, picking the incorrect dose or drug after scrolling on CPOE, or scanning the wrong barcode labels that were attached to an IV bag, since caregivers are more likely not to question these electronic systems once the order is processed. Your thoughts?
Dr. Wachter: Again, it’s the importance of recognizing human factors and the role of culture. I think you’re hinting at something that is even deeper, culturally. Will all the technology make our providers more robotic and have their brains a little bit turned off? Will they just use the technology, assume it gets things right, and not stay vigilant? That’s tricky.
The hope is that the technology ultimately gets good enough that it is completely reliable for tasks that computers do better than humans, such as calculations. And that it builds in enough opportunity for human variation and for distinct situations, so that the providers continue to recognize those.
A major challenge is the way we educate young doctors, nurses, and pharmacists. How do we ensure that people retain the human aspect of being a healthcare professional and stay on their toes, while appropriately relying on the technology to do the things that it does uniquely well? I don’t think we have an answer to that, but it has to be on our agenda.
Dr. Bates: I would agree, and there is some empirical evidence that this phenomenon actually happens. For example, radiation therapy has gone very far toward making things ultra-safe. But one consequence is that the machine is now almost always right, and it has been demonstrated empirically that providers are less likely to question things when the machine says, here’s the dosage you should give, and so on. It’s clear that the overall process of radiation therapy is vastly safer than it was before implementation of a lot of the technology, but this is a real challenge.
Dr. Wachter: We see it in diagnostic errors, as well. The issue is how do you train providers to keep in their brain the reflex that says, “This seems weird, I’d better double-check it and be sure it’s right”? Once we become completely reliant on the technology, we will assume that it does a calculation correctly and, if it suggests a dose, that the dose is correct.
We really have to stay vigilant about training people to trust that instinct. In so many cognitive errors, when you do a root cause analysis, you find that there were people in the process of delivering the care who were scratching their heads and saying, “Oh, that’s funny, that’s kind of weird.” Then they said the most dangerous words in safety, “Oh, it must be right,” i.e., the doctor knows what he’s doing, the technology never makes a mistake. They went forward and did something even when they sensed something was a little bit off. Many times where you see these glitches, somebody had a suspicion that something seemed wrong, yet they didn’t trust their instinct. Getting people to trust that instinct is probably the best we’re going to do.
Dr. Vanderveen: How should organizations be monitoring medication errors and adverse drug events, and what are some of the challenges in doing this?
Dr. Bates: I think that organizations should have a portfolio. One thing that they should do is collect the errors identified by pharmacists in the pharmacy. Tim Lesar, who is one of the pioneers in this area, demonstrated that you can do a lot of improvement work using that information.
Second, if you want to get at your administration error rate, the observation approach pioneered by Ken Barker and his group is a good one. You can go in and spot check periodically using that approach and find errors. To find adverse drug events, it’s relatively simple to use an approach like the Institute for Healthcare Improvement (IHI) trigger tool to review a set of admissions and find out how many adverse drug events you’re having.
I think before too long most hospitals will be using computer monitoring for adverse drug events, i.e., going through patient records and looking for signals that suggest that a patient might be having a problem. In many instances you can intervene and keep the problem from becoming a bigger one, or even head it off entirely. Work that we’ve done shows that, at least in our institution, that pays for itself. Now there are tools that will enable any hospital, even one without a sophisticated information system, to implement that kind of approach.
Dr. Vanderveen: Dr. Wachter, how should the new CMS “no pay” approach affect how hospitals behave?
Dr. Wachter: If you just look at the raw impact of the dollars at stake, it shouldn’t affect anything. CMS has estimated that the total decrease in payments to hospitals based on not paying for this list of ten adverse events will have a total financial impact to Medicare nationally of about $25 million. That amount of money falls out of their pockets about every three minutes. So it is a trivial amount of money that is at stake here. This is actually a very clever policy intervention, because it’s leading to much, much more change than you would guess from the dollars at stake.
What should hospitals do? They need to get ready for this, because over time these initiatives will grow. We’re already seeing private insurers embracing the same methodology. The State of California is considering legislation that would expand Medicare’s initial list of ten adverse events to all 28 adverse events on the National Quality Forum (NQF) list. You have to understand and improve your coding system very well to address this whole issue of defining adverse events as having been present on admission. Nobody has really done that very well.
To the extent that this drives our patient safety efforts, that is terrific. You now need to focus on preventing falls and preventing decubitus ulcers. So while the dollars at stake are small, this is yet another beat in the drumbeat of policy activities that are designed to put skin in the game.
This is the first time that the payment system is being used to promote safety. This is different from pay-for-performance, which is largely quality oriented. Hospitals will have to get better at coding, better at prevention, and better at understanding their processes to make sure that patients don’t have bad things happen to them in hospitals.
Dr. Vanderveen: Dr. Wachter, in your mind what’s the role of public reporting and transparency?
Dr. Wachter: At this point in the safety field, very little, because of how difficult it is to measure errors. But this is changing rapidly. Once the NQF came up with its list of 28 “preventable adverse events,” that list became the scaffolding for public reporting of errors to the states. Now 27 states require that when you have a bad error on your watch, you have to report it to the state, and many of those states are making the data publicly available. So the same kind of pressure that hospitals already feel to improve their CMS core measures, which are mostly quality measures, they are beginning to feel on the safety side. To me, the biggest surprise of the last ten years is that, on the quality side, public reporting works extraordinarily well.
Ten years ago I would have said that public reporting will work only if patients pay attention to the public reports and choose their doctors or hospitals based on them. Today there is no evidence that that is happening. But healthcare people are doing back flips and putting resources into trying to fix these problems. I think the motivation is mostly just shame and embarrassment. We don’t want to look bad; we want to look good and we want to be good. So public reporting does seem to be catalyzing significant changes, and we’re starting to see it being applied to safety as well as quality.
Dr. Vanderveen: Dr. Bates, what is the role of regulation with respect to technologies that may improve safety?
Dr. Bates: That’s a tricky one. I think that the best approach is to wait until adoption of a specific technology is in the 50 to 70% range, or maybe even higher depending on the rate of adoption, before mandating it. In general, I favor carrots as opposed to sticks, but sticks can be useful, too. For example, we’ve said that we will not have providers in our network anymore unless they’re willing to ePrescribe. A law was just passed in Massachusetts that will require all physicians to be using an electronic record by 2015 if they want to be licensed in the state. That will, I think, move the needle. In Massachusetts the payers have all agreed to give hospitals bonuses only if they have implemented CPOE within the next several years.
So both regulation and incentives play a role. I think incentives should be used first, but there are going to be some laggards. If we want everybody to move, we do have to use some regulation, as well.
David Bates is chief of the Division of General Medicine at Brigham and Women’s Hospital, Boston, and one of the most prolific researchers in medication safety. He has been a leader in helping hospitals understand the nature and frequency of medication errors and the impact of various technologies that help to address error detection and prevention. Bates and his colleagues have published extensively on virtually every medication safety technology, including CPOE, smart pumps and barcode medication administration.
Robert Wachter is chief of the Division of Hospital Medicine and Medical Service at UCSF Medical Center, San Francisco. Wachter is credited with founding the hospitalist specialty, the fastest growing specialty in medical history. He has published two books on patient safety, Internal Bleeding: The Truth Behind America’s Terrifying Epidemic of Medical Mistakes and Understanding Patient Safety, and writes Wachter’s World (http://www.wachtersworld.org/), one of healthcare’s most widely read blogs.
Tim Vanderveen is vice president of the Center for Safety and Clinical Excellence. He is responsible for ensuring Cardinal Health’s commitment to education and innovation to reduce variation in clinical practice, and to supporting hospitals’ patient safety initiatives. Prior to this position, Vanderveen was the director of clinical affairs, medication management systems, for ALARIS Medical Systems. He has been instrumental in the development of many of the innovations and safety and performance enhancements in drug infusion.
Bates, D. W., Leape, L. L., Cullen, D. J., Laird, N., Petersen, L. A., Teich, J. M., Burdick, E., et al. (1998, October 21). Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA, 280(15), 1311-1316.
Del Beccaro, M. A., Jeffries, H. E., Eisenberg, M. A., & Harry, E. D. (2006, July). Computerized provider order entry implementation: no association with increased mortality rates in an intensive care unit. Pediatrics, 118(1), 290-295.
Gartner, Inc. Stamford, CT. http://www.gartner.com/ (accessed March 5, 2009).
Han, Y. Y., Carcillo, J. A., Venkataraman, S. T., Clark, R. S., Watson, R. S., Nguyen, T. C., Bayir, H., Orr, R. A. (2005, December). Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics, 116(6), 1506-1512.
Institute of Medicine. Committee on Quality of Health Care in America. (2000). To err is human: Building a safer health system. L.T. Kohn, J. M. Corrigan, & M. S. Donaldson (Eds.). Washington, DC: National Academy Press.
Koppel, R., Metlay, J. P., Cohen, A., Abaluck, B., Localio, A. R., Kimmel, S. E., & Strom, B. L. (2005, March 9). Role of computerized physician order entry systems in facilitating medication errors. JAMA, 293(10), 1197–1203.
Metzger, J. B., Welebob, E., Turisco, F., & Classen, D. (2008, July/August). The Leapfrog Group’s CPOE standard and evaluation tool. Patient Safety & Quality Healthcare, 5(4), 22–25.
Metzger, J. B., Welebob, E., Turisco, F., & Classen, D. (2008, September/October). Effective use of medication-related decision support in CPOE. Patient Safety & Quality Healthcare, 5(5), 16–24.
Poon, E. G., Cina, J. L., Churchill, W., Patel, N., Featherstone, E., Rothschild, J. M., Keohane, C. A., et al. (2006, September). Medication dispensing errors and potential adverse drug events before and after implementing bar code technology in the pharmacy. Annals of Internal Medicine, 145(6), 426-434.
Rothschild, J. M., Keohane, C. A., Cook, E. F., Orav, E. J., Burdick, E., Thompson, S., Hayes, J., & Bates, D. W. (2005, March). A controlled trial of smart infusion pumps to improve medication safety in critically ill patients. Critical Care Medicine, 33(3), 533-540.