Health care organizations are taking a page out of the books of high-reliability organizations.
To make health care safer, many health care organizations are attempting to adopt the characteristics of high-reliability organizations (HROs) that have achieved impressive safety records despite operating in unforgiving environments. Examples of HROs include aircraft carriers and nuclear power plants.
HROs consistently navigate complex, dynamic, and time-pressured conditions in a nearly error-free manner.1,2 They achieve this exceptional performance through a collective behavioral capacity that enables them to detect and correct errors and adapt to unexpected events, despite a changing environment.3-6
RELIABILITY IN HROS
HROs think that variability in practices, in the form of moment-to-moment adaptations and timely adjustments, is exactly what improves reliability.7 To deal with unexpected events, HROs are alert to the possibility of errors and agree that it is necessary to detect, understand, and recover from unexpected events before they cause harm.1,3,8 These cognitive processes are driven by a chronic, deep sense of unease that arises from admitting the possibility of failure even with stable, well-designed procedures in place.1,3
PREOCCUPATION WITH FAILURE
A chronic worry about system failure is a distinctive attribute of HROs.1-7,9,10 People in HROs are naturally suspicious of “quiet periods” and reluctant to engage in activities that are not sensitive to the possibility of error.1 They ask, “What happens when the system fails?” not, “What happens if the system fails?”4 Workers in an HRO possess an intelligent wariness about their work and an enhanced sense of risk awareness and wisdom about errors.7 They have moved from a mindset of “no harm, no foul” to searching out and reviewing close calls or near failures to address areas of potential risk to prevent adverse events.9
This preoccupation with failure runs counter to various human cognitive biases.11 For example, a normalcy bias makes it difficult for us to engage in “worst-case” thinking and plan for a serious disaster or failure. This bias causes us to assume that although a catastrophic event has happened to others, it will not happen to us. Other challenges that make it difficult to maintain a preoccupation with failure include an optimism bias, which leads to overestimation of favorable outcomes, and the ostrich effect, which is the tendency for people to avoid unpleasant information.
HROs encourage and reward error and near-miss reporting. They clearly recognize that the value of remaining fully informed about safety is far greater than any perceived benefit from disciplinary actions. Landau and Chisholm5 emphasized this point more than 2 decades ago when describing a seaman on a Navy nuclear aircraft carrier who broke a vital rule: He did not keep track of all his tools while working on the landing deck. The seaman subsequently found 1 of his tools missing and immediately reported it. All aircraft en route to the carrier were redirected to land bases until the tool was found. The next day, the seaman was commended for his disclosure during a formal ceremony.
HROs pay close attention to near misses and can clearly see how close they have come to a full-blown disaster. Less-safe organizations consider close calls to be evidence of their ability to avoid a disaster.1 HROs work on the assumption that what seems to be an isolated event is likely caused by a confluence of numerous upstream errors.7 Less-safe organizations also tend to localize failures (eg, the problem is in the specific pharmacy, so changes are needed only in that pharmacy). HROs generalize even small failures and consider them a lens to uncover weaknesses in other vulnerable parts of the system.1,3
Michael J. Gaunt, PharmD, is a medication safety analyst and the editor of ISMP Medication Safety Alert! Community/Ambulatory Care Edition.
References