Quality Improvement: Quality Care Starts with Accurate Data


January / February 2007


How good are we? Recent news stories have illuminated the fact that fewer than one-third of patients suffering heart attacks get their blocked arteries opened within the recommended 90-minute timeframe. From those reports, it would appear that we are not doing well at all. That may be true, but my experience as an accreditation reviewer indicates that the collection of data within hospitals is inaccurate and inconsistent, making it impossible to evaluate processes meaningfully across the industry.

To answer the question, “How good are we?” we must first ensure that our data is accurate. Collecting accurate data requires an organized, systematic process. Accuracy implies “correctness”; precision implies “reproducibility.” Data must be both correct (accurate) and reproducible (precise).

As our team travels from hospital to hospital, performing Chest Pain Center Accreditations (currently more than 300 hospitals), we find two consistent problems that prevent facilities from accurately assessing the timeliness of care given to patients. First, the data points used in the collection method are inconsistent and unreliable. Second, hospitals are not using consistent operational definitions from which to collect their data. In many cases, patients and practitioners believe that the playing field is level, when in reality the information cannot be accurately compared because collection methods are uncontrolled and chaotic.

Accurate Data Points
What data points are needed to measure the timeliness of a process? The answer is simple: accurate and synchronized clocks! Think about your facility and ask yourself a few questions. Are the clocks in your facility synchronized? For example, if a patient presents to triage or registration and their arrival time is documented, is the computerized time synchronized with your ECG machine? Many times, we find the difference between the two ranges from 1 minute to as much as 60 minutes or more. Yet we find that facilities are using the time stamps from these separate machines and reporting door-to-ECG times into national databases. Is the data accurate? The answer is “no.”

Is your cardiac cath lab equipment synchronized with the emergency department time? We have seen facilities where these clocks differ by up to 30 minutes! Yet, the facilities are reporting door-to-PCI times into prominent and widely used databases.

The time collection method that introduces the most variability is allowing employees and patients to use their own wristwatches to document their arrival times. First, think of those patients and employees who round up or down when telling time. Instead of recording the exact time, they round to the nearest 5- to 15-minute interval. How accurate is that data? Second, think of those patients who may record a time several minutes before they really arrived, so they can claim they have been waiting in the emergency department lobby longer than they really have. Finally, think of those patients and employees who routinely set their watches ahead so they will never be late to an appointment, or who allow their watches to run “a few minutes slow.”
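To make the distortion concrete, here is a minimal sketch (in Python, with hypothetical clock offsets) of how two unsynchronized device clocks corrupt a reported door-to-ECG interval:

```python
from datetime import datetime, timedelta

def reported_interval(true_arrival, true_event,
                      arrival_clock_offset, event_clock_offset):
    """Interval computed from time stamps taken on two different devices.
    Each offset is that device clock's error relative to true time."""
    stamped_arrival = true_arrival + arrival_clock_offset
    stamped_event = true_event + event_clock_offset
    return stamped_event - stamped_arrival

# The patient's true door-to-ECG time is 8 minutes.
arrival = datetime(2007, 1, 15, 9, 0)
ecg_done = arrival + timedelta(minutes=8)

# Hypothetical skew: the registration computer runs 6 minutes fast,
# and the ECG machine runs 4 minutes slow.
reported = reported_interval(arrival, ecg_done,
                             timedelta(minutes=6), timedelta(minutes=-4))
# reported == timedelta(minutes=-2): a negative, nonsensical interval,
# even though the true door-to-ECG time was 8 minutes.
```

The offsets here are invented for illustration, but they sit well inside the 1- to 60-minute discrepancies described above; with skew in the other direction, a facility could just as easily report an 18-minute interval for the same 8-minute process.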

Several celebrated national databases publish hospital performance data on the Internet. As healthcare “best practices” develop, several defining quality markers are determined by the timeliness of a care process. Examples include door to ECG, door to fibrinolytic, and door to PCI. When our care is questioned by our colleagues and patients, shouldn’t we demand accuracy? We are told this material is being published so the public can make informed decisions regarding the quality of care. But how informed can those decisions be?

Consistent Operational Definitions
Inconsistent use of operational definitions by organizations across the industry is another common problem that distorts reported data. An operational definition is a definition that all participants agree to use. It removes ambiguity and is essential to ensure the accuracy and precision of data. The use of operational definitions is a fundamental concept that permits multiple data handlers to input data in a repetitive and predictable manner, minimizing variation in the abstraction process.

For example, many mistakes start with the simple operational definition of “door time.” As we travel the country on our accreditation site visits, we see this “door time” applied to registration, triage, or even when the patient is placed in a bed. Theoretically these could all be correct. The most common mistake we encounter is capturing the “door time” when patients enter triage even when patients complete registration first. Although the patient may have spent 10 minutes or more in registration, the facility only captures the time when the patient enters triage. In essence, the facility will report a door-to-ECG time of 10 minutes, but in reality the door-to-ECG time should be 15 or 20 minutes depending on how much time the patient spent in the registration process. The public may think that a facility is doing better than their competitor, when in reality their competitor could be using the correct operational definition of “door time” and capturing the time that the patient first presents to the facility.

From a practical standpoint, facilities should use the following operational definition for door time: “The first instant the patient is physically within the ED (including the waiting area, registration, triage, etc.) regardless of mode of arrival.” That allows comparison across the industry, yet allows organizations to define what processes are most advantageous for their situation. For example, a facility could choose to triage the patient first or register the patient first. Either process would yield accurate data, because the facility captures the patient’s first encounter with facility personnel regardless of which event comes first.
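Under that definition, the abstraction rule reduces to “take the earliest recorded first-encounter time stamp, whatever the event was.” A minimal sketch (Python, with hypothetical field names) might look like:

```python
from datetime import datetime

def door_time(encounter_times):
    """Operational 'door time': the earliest recorded time stamp among the
    patient's first-encounter events (waiting area, registration, triage, ...),
    regardless of which step the facility happens to perform first."""
    recorded = [t for t in encounter_times.values() if t is not None]
    return min(recorded)

# Hypothetical visit: registration happened before triage, so door time is
# 9:00, not the 9:10 triage time that would understate door-to-ECG by 10 minutes.
visit = {
    "registration": datetime(2007, 1, 15, 9, 0),
    "triage": datetime(2007, 1, 15, 9, 10),
    "bed": None,  # not yet recorded
}
```

Here `door_time(visit)` returns the 9:00 registration time; a facility that abstracted only the triage stamp would report every interval 10 minutes shorter than it really was.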

If you have variation regarding clock synchronization or the application of operational definitions, how do you really know how your organization (or our industry) is doing? Furthermore, how can we improve the quality of care if we are unsure if the data is correct?

A common model used in process improvement says that to solve a problem, we must define the problem, contain the problem, determine the root cause, implement a solution, and verify that the new solution works.

How can we do any of these steps if we are not producing consistently reliable data? How can we define the problem if we do not know that we have a problem?

Finally, we must consider our integrity and credibility, which should be cornerstones of our industry. The public trusts us to care for their health and the health of their children, parents, and other extended family members. It is time that we demand accuracy in these data collection methods. From our findings, we believe that facilities are unintentionally skewing the data: they usually have not been given adequate training or instruction in data abstraction, and they have rarely considered the skewing that occurs when clocks are unsynchronized or operational definitions are applied inconsistently.

As nurses, doctors, and technicians, we must challenge ourselves to improve processes within our organizations. Recent news stories have highlighted problems that we must fix. But, if we are not accurately measuring the process, how can we improve it? Furthermore, if we are not able to measure it, how can we assess the care that is provided? That leads back to our original question, “How good are we?” The fact is, we don’t know.

Michael Callahan is an accreditation reviewer with AMC Registry, Inc., in Upper Arlington, Ohio. He works primarily with hospitals across the United States on chest pain center accreditation, with an emphasis on process improvement and adherence to clinical guidelines. His previous experience was as a staff nurse, clinical educator, and emergency room manager at Adena Health System in Chillicothe, Ohio. Callahan may be contacted at michael.callahan@amcinc.us.