Inappropriate trust in technology: implications for critical care nurses
- 8 February 2011
- journal article
- review article
- Published by Wiley in Nursing in Critical Care
- Vol. 16 (2), 92-98
- https://doi.org/10.1111/j.1478-5153.2010.00407.x
Abstract
To explore evidence from the literature that critical care nurses may place inappropriate levels of trust in the technological equipment they use, and the implications of this for patient safety. Nurses in intensive care units are required to observe the operation of an array of complex equipment, and failure of this equipment can have potentially fatal consequences for the patient. Research from other settings, such as the work of airline pilots, suggests that experienced operators of highly reliable automation may display inappropriately high levels of trust in the automation, and that this can lead to inadequate monitoring of the equipment by the operator. Inadequate monitoring means that the operator may fail to notice that the equipment is not functioning correctly, which can have serious consequences.

An initial search was made of a number of databases, including Academic Search Premier, CINAHL, PubMed and ScienceDirect. Extensive use was also made of citations found in articles uncovered by this initial search.

The evidence suggests that there is potential for critical care nurses to display complacent attitudes. In addition, there are a number of reasons why the consequences of this complacency are less visible than in other settings. If nurses are not aware of the potential for, and consequences of, inappropriate trust, there is a real possibility that patients may suffer harm because of it. There is an urgent need for more research to identify direct evidence of complacency and its consequences. There is also a need for these issues to be highlighted in the training of intensive care nurses, and there are implications for intensive care unit practice protocols and equipment manufacturers.