Understanding Human Over-Reliance on Technology

Design the technology to reduce over-reliance. The design of the technology affects users’ attention and how they judge its value and reliability. For example, an “auto-complete” function that suggests drug names after only the first few letters has often led to selection of the first, but incorrect, choice offered by the technology. Requiring entry of at least 4 letters before a list of potential drug names is generated could reduce these types of errors. To cite another example, studies have found that providing too much on-screen detail can decrease the user’s attention and care, thereby increasing automation bias (Goddard, Roudsari, & Wyatt, 2012).
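As a rough illustration of the minimum-letter strategy, the minimal sketch below (in Python) withholds suggestions until 4 letters have been typed and then returns a list for the user to review rather than pre-selecting the first match. The four-drug formulary and function names are illustrative assumptions, not any vendor’s actual interface.

```python
# Minimal sketch of the "minimum characters" design strategy described above.
# The 4-letter threshold comes from the text; the formulary list and function
# names are illustrative assumptions only.

MIN_CHARS = 4  # no suggestions until this many letters are typed

FORMULARY = [
    "hydralazine",
    "hydrochlorothiazide",
    "hydromorphone",
    "hydroxyzine",
]

def suggest_drugs(typed: str, formulary=FORMULARY) -> list[str]:
    """Return candidate drug names, but only after MIN_CHARS letters are typed."""
    typed = typed.strip().lower()
    if len(typed) < MIN_CHARS:
        return []  # too few letters: keep typing rather than guessing
    # Return the full matching list for the user to review; never
    # pre-select the first match on the user's behalf.
    return [name for name in formulary if name.startswith(typed)]

print(suggest_drugs("hyd"))   # [] -- below the threshold, no suggestions yet
print(suggest_drugs("hydr"))  # all four look-alike names, forcing an explicit choice
```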

Provide training. Provide training about the technology involved in the medication-use system to all staff who use it. Include information about the limitations of the technology, as well as previously identified gaps and opportunities for error. Allow trainees to experience automation failures during training (e.g., failure of the technology to issue an important alert; discrepancies between technology entries and handwritten entries in which the handwritten entries are correct; “auto-fill” or “auto-correct” errors; incorrect calculation of body surface area because the weight was entered in pounds instead of kilograms). Experiencing technology failures during training encourages critical thinking when using automated systems, which can help reduce errors due to complacency and automation bias (Goddard, Roudsari, & Wyatt, 2012) and may increase the likelihood that staff will recognize such failures during daily work.
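To make the pounds-versus-kilograms scenario concrete, the hedged sketch below applies the Mosteller formula for body surface area. The patient values and the 2.5 m² alert threshold are assumptions for illustration only, not any system’s actual safeguard.

```python
# Illustration of the pounds-vs-kilograms scenario above, using the
# Mosteller formula: BSA (m^2) = sqrt(height_cm * weight_kg / 3600).
# The patient values and the 2.5 m^2 alert threshold are illustrative
# assumptions, not any vendor's actual logic.

import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m^2 per the Mosteller formula."""
    return math.sqrt(height_cm * weight_kg / 3600)

height_cm = 170.0
correct = bsa_mosteller(height_cm, 70.0)    # weight entered in kg: ~1.82 m^2
mixed_up = bsa_mosteller(height_cm, 154.0)  # same patient's weight in lb keyed
                                            # into the kg field: ~2.70 m^2

print(f"Correct BSA: {correct:.2f} m^2; with lb/kg mix-up: {mixed_up:.2f} m^2")

# Because 154 is also a plausible adult weight in kg, a simple range check on
# the weight alone would not catch this; a downstream check on the computed
# BSA might. The 2.5 m^2 cutoff here is only an example.
if mixed_up > 2.5:
    print("Unusually high BSA; confirm weight and units before dosing.")
```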

Reduce task distraction. Although easier said than done, leaders should attempt to ensure that those using technology can do so without interruption and are not simultaneously responsible for other tasks. Automation failures are less likely to be identified if the user must multitask or is otherwise distracted or rushed (Parasuraman & Manzey, 2010).

Conclusion

Technology plays an important role in the design and improvement of medication systems; however, it must be viewed as supplementary to clinical judgment. Although its use can make many aspects of the medication-use system safer, healthcare professionals must continue to apply their clinical knowledge and critical thinking skills when using and monitoring technology to provide optimal patient care.


ISMP thanks ISMP Canada for its generous contribution to the content for this article.

This column was prepared by the Institute for Safe Medication Practices (ISMP), an independent, charitable nonprofit organization dedicated entirely to medication error prevention and safe medication use. Any reports described in this column were received through the ISMP Medication Errors Reporting Program. Errors, close calls, or hazardous conditions may be reported online at www.ismp.org or by calling 800-FAIL-SAFE (800-324-5723). ISMP is a federally certified patient safety organization (PSO), providing legal protection and confidentiality for patient safety data and error reports it receives. Visit www.ismp.org for more information on ISMP’s medication safety newsletters and other risk reduction tools. This article appeared originally in the September 8, 2016, issue of the ISMP Medication Safety Alert!


References

Campbell, E. M., Sittig, D. F., Guappone, K. P., Dykstra, R. H., & Ash, J. S. (2007). Overdependence on technology: An unintended adverse consequence of computerized provider order entry. AMIA Annu Symp Proc, 2007, 94–98.

Coiera, E. (2015). Technology, cognition and error. BMJ Qual Saf, 24(7), 417–422.

Goddard, K., Roudsari, A., & Wyatt, J. C. (2012). Automation bias: A systematic review of frequency, effect mediators, and mitigators. J Am Med Inform Assoc, 19(1), 121–127.

Goddard, K., Roudsari, A., & Wyatt, J. C. (2014). Automation bias: Empirical results assessing influencing factors. Int J Med Inform, 83(5), 368–375.

ISMP Canada. (2016). Understanding human over-reliance on technology. ISMP Canada Safety Bulletin, 16(5), 1–4.

Lee, J. D., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35(10), 1243–1270.

Mahoney, C. D., Berard-Collins, C. M., Coleman, R., Amaral, J. F., & Cotter, C. M. (2007). Effects of an integrated clinical information system on medication safety in a multi-hospital setting. Am J Health Syst Pharm, 64(18), 1969–1977.

Mosier, K. L., & Skitka, L. J. (1996). Human decision makers and automated decision aids: Made for each other? In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 201–220). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Hum Factors, 52(3), 381–410.

Parasuraman, R., Molloy, R., & Singh, I. L. (1993). Performance consequences of automation-induced “complacency.” Int J Aviat Psychol, 3(1), 1–23.

Yeh, M., & Wickens, C. D. (2001). Display signaling in augmented reality: Effects of cue reliability and image realism on attention allocation and trust calibration. Hum Factors, 43(3), 355–365.