Understanding Human Over-Reliance on Technology

Causes of automation bias and complacency

Automation bias and complacency are thought to result from three basic human factors (Parasuraman & Manzey, 2010; Goddard, Roudsari, & Wyatt, 2012):

  • In decision-making, people tend to select the pathway requiring the least cognitive effort, which often means letting technology dictate the path. This factor likely plays a greater role when people face more complex tasks, multitasking, heavier workloads, or increasing time pressures, all common in healthcare.

  • People often believe that the analytic capability of technology is superior to that of humans, which may lead them to overestimate the performance of these technologies.

  • People may reduce their effort or shed responsibility in carrying out a task when an automated system is also performing the same function. It has been suggested that the use of technology leads people to hand over tasks and their associated responsibilities to the automated system (Coiera, 2015; Mosier & Skitka, 1996). This mental handover can reduce the vigilance the person would demonstrate if carrying out the task independently.

Other conditions linked to automation bias and complacency are discussed below.

Experience. There is conflicting evidence on the effect of experience on automation bias and complacency. While there is evidence that reliance on technology decreases as experience and confidence in one's own decisions increase, it has also been shown that increased familiarity with technology can lead to desensitization, causing clinicians to doubt their own instincts and accept inaccurate technology-derived information (Goddard, Roudsari, & Wyatt, 2012). Thus, automation bias and complacency have been found in both naïve and expert users (Parasuraman & Manzey, 2010).

Perceived reliability and trust in the technology. Although automation bias and complacency were once attributed to a general tendency to trust all technology, they are now believed to be influenced by the perceived reliability of a specific technology, based on the user's prior experiences with that system (Parasuraman & Manzey, 2010). When automation is perceived to be reliable at least 70% of the time, people are less likely to question its accuracy (Campbell, Sittig, Guappone, Dykstra, & Ash, 2007).

Confidence in decisions. Whereas trust in technology increases automation bias and complacency, users are less likely to be biased when they are confident in their own decisions (Goddard, Roudsari, & Wyatt, 2012; Lee & Moray, 1992; Yeh & Wickens, 2001).

Safe practice recommendations 

The use of technology is considered a high-leverage strategy to optimize clinical decision-making—but only if the users’ trust in the technology closely matches the reliability of the technology itself. Therefore, the following strategies to address errors related to automation bias and complacency focus on:

  • Improving the reliability of the technology itself
  • Supporting clinicians to more accurately assess the reliability of the technology, so that appropriate monitoring and verification strategies can be employed

Analyze and address vulnerabilities. Conduct a proactive risk assessment (e.g., failure mode and effects analysis [FMEA]) for new technologies to identify unanticipated vulnerabilities and address them before undertaking facility-wide implementation. Also encourage reporting of technology-associated risks, issues, and errors.

Limit human-computer interfaces. Organizations should continue to enable all technologies to communicate seamlessly with one another, limiting the need for human interaction at system interfaces, where errors can be introduced.