Understanding Human Over-Reliance on Technology

On the patient care unit, the order for Dilantin had been correctly transcribed by hand onto a daily computer-generated medication administration record (MAR), which was verified against the prescriber’s order and cosigned by a nurse. The nurse who obtained the medication from the unit’s automated dispensing cabinet (ADC) noticed the discrepancy between the MAR and the ADC display but accepted the information displayed on the ADC screen as correct. The patient received one dose of long-acting dilTIAZem 300 mg orally instead of the Dilantin 300 mg as ordered. The error was caught the next morning when the patient exhibited significant hypotension and bradycardia.

Automation bias and automation complacency

The tendency to favor or give greater credence to information from technology (e.g., an ADC display) and to ignore a contradictory manual source of information (e.g., a handwritten entry on the computer-generated MAR), even when the manual source is correct, illustrates the phenomenon of automation bias (Goddard, Roudsari, & Wyatt, 2012). Automation complacency is a closely linked, overlapping concept that refers to monitoring technology less frequently or with less vigilance because of a lower suspicion of error and a stronger belief in the technology’s accuracy (Parasuraman & Manzey, 2010). End-users of a technology (e.g., a nurse who relies on the ADC display that lists medications to be administered) tend to forget or ignore that the information from the device may depend on data entered by a person. In other words, processes that may appear to be wholly automated are often dependent upon human input at critical points and thus require the same degree of monitoring and attention as manual processes. These two phenomena can affect decision-making in individuals as well as in teams and can offset the benefits of technology (Parasuraman & Manzey, 2010).

Automation bias and complacency can lead to decisions that are not based on a thorough analysis of all available information but that are strongly biased toward the presumed accuracy of the technology (Parasuraman & Manzey, 2010). While these effects are inconsequential if the technology is correct, errors are possible if the technology output is misleading. An automation bias omission error takes place when users rely on the technology to inform them of a problem (e.g., an excessive dose warning) but it fails to do so; thus, they do not respond to a potentially critical situation because they were not prompted to respond. An automation bias commission error occurs when users make choices based on incorrect suggestions or information provided by technology (Goddard, Roudsari, & Wyatt, 2012). In the Dilantin incident described above, automation bias caused two errors: the first occurred when the pharmacy staff member accepted dilTIAZem as the correct drug in the pharmacy order entry system; the second occurred when the nurse identified the discrepancy between the ADC display and the MAR but trusted the information on the ADC display over the handwritten entry on the computer-generated MAR.

In recent analyses of health-related studies on automation bias and complacency, clinicians overrode their own correct decisions in favor of erroneous advice from technology between 6% and 11% of the time (Goddard, Roudsari, & Wyatt, 2012), and the risk of an incorrect decision increased by 26% if the technology output was in error (Goddard, Roudsari, & Wyatt, 2014). The rate of detecting technology failures is also low; in one study, half of all users failed to detect any of the technology failures introduced during the course of a typical work day (e.g., an important alert did not fire, or the wrong information or recommendation was presented) (Parasuraman & Manzey, 2010; Parasuraman, Molloy, & Singh, 1993).