Safety Requires a Healthy Preoccupation With Failure

Pharmacy Times, June 2019 Women's Health, Volume 85, Issue 6

Health care organizations are taking a page from the playbook of high-reliability organizations.

To make health care safer, many health care organizations are attempting to adopt the characteristics of high-reliability organizations (HROs) that have achieved impressive safety records despite operating in unforgiving environments. Examples of HROs include aircraft carriers and nuclear power plants.

HROs consistently navigate complex, dynamic, and time-pressured conditions in a nearly error-free manner.1,2 They achieve this exceptional performance through a collective behavioral capacity that enables them to detect and correct errors and adapt to unexpected events, despite a changing environment.3-6

RELIABILITY IN HROS

HROs recognize that variability in practices, in the form of moment-to-moment adaptations and timely adjustments, is exactly what improves reliability.7 To deal with unexpected events, HROs remain alert to the possibility of error and agree that it is necessary to detect, understand, and recover from unexpected events before they cause harm.1,3,8 These cognitive processes are driven by a chronic, deep sense of unease that arises from admitting the possibility of failure even with stable, well-designed procedures in place.1,3

PREOCCUPATION WITH FAILURE

A chronic worry about system failure is a distinctive attribute of HROs.1-7,9,10 People in HROs are naturally suspicious of “quiet periods” and reluctant to engage in activities that are not sensitive to the possibility of error.1 They ask, “What happens when the system fails?” not, “What happens if the system fails?”4 Workers in an HRO possess an intelligent wariness about their work and an enhanced sense of risk awareness and wisdom about errors.7 They have moved beyond a “no harm, no foul” mindset, instead searching out and reviewing close calls and near failures to address areas of potential risk and prevent adverse events.9

This preoccupation with failure runs counter to several human cognitive biases.11 For example, a normalcy bias makes it difficult for us to engage in “worst-case” thinking and plan for a serious disaster or failure. This bias causes us to assume that although a catastrophic event has happened to others, it will not happen to us. Other challenges that make it difficult to maintain a preoccupation with failure include an optimism bias, which leads to overestimation of favorable outcomes, and the ostrich effect, the tendency to avoid unpleasant information.

HROs encourage and reward error and near-miss reporting. They clearly recognize that the value of remaining fully informed about safety is far greater than any perceived benefit of disciplinary action. Landau and Chisholm5 emphasized this point more than 2 decades ago when describing a seaman on a Navy nuclear aircraft carrier who broke a vital rule: He did not keep track of all his tools while working on the flight deck. The seaman subsequently found 1 of his tools missing and immediately reported it. All aircraft en route to the carrier were redirected to land bases until the tool was found. The next day, the seaman was commended for his disclosure during a formal ceremony.

HROs pay close attention to near misses and can clearly see how close they come to a full-blown disaster. Less-safe organizations consider close calls to be evidence of their ability to avoid disaster.1 HROs work on the assumption that what seems to be an isolated event is likely caused by a confluence of numerous upstream errors.7 Less-safe organizations also tend to localize failures (eg, the problem is in a specific pharmacy, so changes are needed only in that pharmacy). HROs generalize even small failures and treat them as a lens to uncover weaknesses in other vulnerable parts of the system.1,3

Michael J. Gaunt, PharmD, is a medication safety analyst and the editor of ISMP Medication Safety Alert! Community/Ambulatory Care Edition.

References

1. Weick KE, Sutcliffe KM, Obstfeld D. Organizing for high reliability: processes of collective mindfulness. Research in Organizational Behavior. 1999;21:81-123.
2. Vogus TJ, Rothman NB, Sutcliffe KM, et al. The affective foundations of high-reliability organizations. J Organiz Behav. 2014;35(4):592-596.
3. Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.
4. Leonard M, Frankel A, Simmonds T, et al. Achieving Safe and Reliable Healthcare: Strategies and Solutions. Chicago, IL: Health Administration Press; 2004.
5. Vogus TJ, Welbourne TM. Structuring for high reliability: HR practices and mindful processes in reliability-seeking organizations. J Organiz Behav. 2003;24(7):877-903.
6. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. 2nd ed. San Francisco, CA: Jossey-Bass; 2007.
7. Reason J. Individual and collective mindfulness. In: Reason J, ed. The Human Contribution. Burlington, VT: Ashgate Publishing Company; 2010:239-263.
8. Blatt R, Christianson MK, Sutcliffe KM, et al. A sensemaking lens on reliability. J Organiz Behav. 2006;27(7):897-917.
9. Christianson MK, Sutcliffe KM, Miller MA, Iwashyna TJ. Becoming a high reliability organization. Crit Care. 2011;15(6):314. doi:10.1186/cc10360.
10. Reason J. Managing the Risks of Organizational Accidents. Burlington, VT: Ashgate Publishing Company; 2000.
11. Dvorsky G. The 12 cognitive biases that prevent you from being rational. Gizmodo. January 9, 2013. io9.gizmodo.com/the-12-cognitive-biases-that-prevent-you-from-being-rat-5974468. Accessed May 10, 2019.
