The human element plays a pivotal role in adapting to new technology.
By Dennis Tribble, FASHP, PharmD
It has been over 10 years since the Institute of Medicine published its landmark report “To Err is Human,” yet recent reviews of error rates have shown no dramatic reduction. Indeed, the error-rate landscape appears largely unchanged.
Technology presents a number of attractive opportunities to reduce the incidence of error. Why then has the deployment of this technology not produced some of the striking reductions in error rates that we expected from this “silver bullet”?
One explanation involves the human element of technology deployment: literally, “aiming the silver bullet.” When a deployment fails to accommodate the requisite human factors, the technology fails. Several human attributes can confound a deployment:
Preference for avoiding change – Human beings equate habit with competence and actively avoid taking on new ways of working, even when the new way offers opportunities for better outcomes. Pre-automation work habits are built around the lack of that automation, so adding technology to a work practice should change that practice. That change creates discomfort: a feeling of incompetence. Until new habits are formed, the provider feels less productive. When stressed, we naturally fall back on old, habitual ways of solving problems. Moreover, if our sense of self-worth is tied to our mastery of the current system (especially as compared with others who have less mastery), we will actively subvert the new system because it places us at the same level as everyone else.
Failure to understand or believe the reason for the technology – Another human bias is to maintain the illusion of control, even when control does not actually exist. In health care, this takes the form of denial: “Errors happen to someone else… not me…”; “I have been a nurse for 35 years and have never made an error!” When we don’t believe a problem exists, we see no reason to endure the discomfort of learning a new way of working. Instead, we stoically endure the imposition of the technology and find workarounds to force-fit it into our old way of working. In many cases, each provider must go through the epiphany that they can and do make errors (the average intelligent person will make 3 mistakes for every 100 actions), and that the technology, appropriately used, will help avoid those errors. That takes time, and it may require breaking down old conceptions about work and competence.
Failure to implement the technology with sufficient support and infrastructure – Human beings excel at filtering out and ignoring “background noise” (warnings that turn out to be inconsequential). Consider an example: if the oil light in my car comes on and my mechanic tells me it was good that I had it checked, I will keep paying attention to that light. If, on the other hand, it consistently warns me for no cause, I will teach myself to ignore it. Similarly, warning systems—such as bar-code medication administration systems—with a high rate of false warnings eventually train users to presume that a warning reflects a system problem rather than a human error.
Any technology implementation must account for the effort and hand-holding required to help providers accept the technology and learn new habits around it. Key up-front tasks should include:
Describing the endpoints the technology is expected to achieve. If those endpoints include improved safety, then some discussion of normal human error rates is important as part of the effort to get providers to realize, and admit, that human error is part of everyone’s life, and that the proper use of this tool will help identify errors before they cause harm.
Building consensus that adopting the new technology will require the development of new habits around tasks that have been done “the old way” for a long time. While this will likely feel uncomfortable until the new habits are formed, the technology’s success hinges upon it. Using experts in the current system to help develop and implement the new one may preserve their sense of seniority and ease the transition.
Ensuring that the technology is implemented with sufficient infrastructure to succeed. Users who cannot tell the difference between a system that finds a lot of errors and a system that simply doesn’t work will quickly presume the latter.
The human element will always play a pivotal role in the practice of medicine. Organizations that consider, and plan for, that element will be better positioned to aim the silver bullet for success.
Dennis Tribble, FASHP, PharmD, is an expert on health-system pharmacy operations, patient safety and related medication safety issues. A pharmacist and software engineer, he is passionate about the need for a complete restructuring of the pharmacy practice paradigm and the role technology will play in bringing about that vision. A fellow of the American Society of Health-System Pharmacists (ASHP), member and past chairman of the ASHP Section on Pharmacy Informatics and Technology (SOPIT), and a charter member of the Pharmacy Informatics Task Force for the Healthcare Information and Management Systems Society (HIMSS), Dennis can be reached at email@example.com.