Archive for October, 2012
It’s Not Automatically OK
It’s complicated. Changes are flying at us at an accelerating rate, clashing and clanging against what we thought was certain. Legislative measures that may or may not prevail, changes in how performance is evaluated, changes in reimbursement, changes in the roles of all involved – seemingly the only certainty is change itself. What will be the equal and opposite reaction?
Let’s take stock of the current situation. The anticipated shortfall of primary care physicians if the Affordable Care Act (ACA) remains in force is estimated to balloon to more than 60,000 by 2025. Whether or not the ACA remains in force, the active patient population will continue to grow as baby boomers age, and this burgeoning population is adding to the strain caused by increasing multimorbidities and chronic conditions. Medicine, like aviation, is going to face greater demands and a growing need for experts.
This change in the environment will require changes in behavior. Parts 1 and 2 of this series explored how medicine learned from aviation’s work on changing behavior. It started with the checklist, moved into Cockpit Resource Management, and then evolved into Crew Resource Management. Aviation made huge strides in recognizing that, in order to optimize resources in the work environment, it was essential for all crew members to be actively involved. With Crew Resource Management’s team approach widely implemented and on its way to being mastered, the fiercely competitive aviation industry (manufacturers and airlines) scanned the environment for a concept it could leverage to lower training costs, improve safety, and cope with the shrinking pool of already-trained military pilots. And so it was seduced by the siren song of automation.
While automation has reduced costs and improved safety, commercial aviation is now in the midst of the painful process of recognizing and adjusting to the unintended pitfalls of automation reliance. In a fascinating recording of an American Airlines pilot training session, Children of the Magenta, over-reliance on automation is the central theme. The trainer repeatedly demonstrates how pilots rely on automation even when they could easily use manual controls. He cautions that “in 68% of accidents, automation dependency plays a critical part in leading crews… to allow their aircraft to get much closer to the [edge of the envelope] than they should have.” In other words, their reliance on automation has caused them to neglect their critical thinking.
Understanding why automation can lead to failure is important. One interesting study of automation surprises, contained in the Handbook of Human Factors & Ergonomics, points out that the implicit promise of automation technology is that it increases precision and efficiency while reducing the potential for human error. That promise can come up short, however, because human-machine interaction cannot replicate the “basic competencies” of human-human interaction. Humans have difficulty remaining actively engaged while monitoring – the very place automation puts us. We disengage and lose situational awareness.
Moreover, automated systems tend to “switch off” in complex situations, precisely where they may be needed most. The person monitoring may not be sufficiently oriented to the situation to handle it effectively. Even worse, key bits of information can be masked by well-meaning efforts to simplify matters for the human monitoring the process. One widely studied accident occurred in part because key information needed by the crew was masked.
Similarly, in medicine, technological advances have created opportunities to reduce the likelihood of error through systems design and processes. As the number and variety of ways to automate performance increases, so does the reliance on those mechanisms. Physician shortages and finite resources coupled with increasing demand create a pressure to do more with less. Automation may again seem to be the silver bullet. Aviation has learned that automation can help but it comes with its own pitfalls. Health care can avoid these potential pitfalls by learning from aviation’s experience.
So how are we in health care being seduced by automation? It’s easy to see how enticing high-tech diagnostic imaging and robotic surgery can be. Here are some more concrete real-life examples:
- Physicians increasingly request decision support to help them with clinical implementation. An example of this is clinical indicators by condition.
- Electronic medical records (EMR) are being designed so that the physician cannot deviate from the protocol embedded in the EMR without documenting the reason.
- Workstations are being set up and work flows designed so that the clinician has a limited set of options for actions and activities.
Again, automation in itself is not inherently bad. It is when automation is used so widely and extensively that critical thinking falls by the wayside that issues start to arise. Here are some examples of automation’s unintended consequences:
- Electronic dosing schedules that do not include maximum dosages can result in overdosing if there is an over-reliance on the system.
- When processes are developed to reduce error and ensure the clinician “does the right thing,” opportunities for critical thinking are being designed out of the process.
- Over-focus on processes may cause the obvious to be missed. One patient shared her recent experience with a hospital room not being tended to as it should have been. Although the nurses’ station was in close proximity to the room, the problem escaped the floor nurse’s attention because there was no “check room cleanliness” prompt on her computer screen.
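The dosing-schedule bullet above can be made concrete with a small sketch. This is a hypothetical illustration, not any real EMR’s logic: the function name, parameters, and values are invented for the example. The point is simply that a schedule computing doses without an explicit ceiling will happily emit an overdose, while adding a maximum restores a safety check a clinician relying on the system might otherwise skip.

```python
def weight_based_dose(weight_kg, mg_per_kg, max_dose_mg=None):
    """Return a computed dose in mg, capped at max_dose_mg when one is supplied.

    Hypothetical example only; real dosing logic involves far more checks.
    """
    dose = weight_kg * mg_per_kg
    if max_dose_mg is not None:
        # The safety ceiling the bullet above describes as missing
        # from some electronic dosing schedules.
        dose = min(dose, max_dose_mg)
    return dose

# Without a maximum, the computed dose grows without bound for heavier patients.
uncapped = weight_based_dose(120, 15)                   # 1800 mg
# With an explicit ceiling, the same inputs are held to the stated maximum.
capped = weight_based_dose(120, 15, max_dose_mg=1000)   # 1000 mg
```

The design lesson mirrors the article’s larger point: the cap does not replace the clinician’s judgment, but its absence invites over-reliance on a system that silently lacks it.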
Aviation and health care’s shared draw to automation, and the potential for negative unintended consequences, has not escaped the notice of those straddling both industries. One such individual, a clinician from Sydney, pointed this out on a pilot’s forum. He commented, “You know, this is a broader cultural issue. We are seeing the exact same paradox in medicine. Everything is being reduced to autopilot with clinical pathways and guidelines. This approach inevitably de-emphasizes critical thinking, and clinical decision making skills are being lost as a result.”
A loss of focus on critical decision-making skills now may have a large impact in the future. The commenter continues by pointing out that medicine operates like an apprenticeship, in which students and junior physicians model their future practices on what they observe. They are likely to practice what they learn.
He points out that while the benefits of automation may be immediate, the true cost may not be realized for years, and cautions, “This is like aviation. All the airlines have followed this path for years – in part due to regulatory requirements, but also due to significant economic benefits. But people are now starting to question the consequences.”
The risk of error in both health care and aviation increases when the automation is highly complex or inconsistent. Additionally, the visual and tactile cues that can increase situational awareness are often casualties of the automation design process. For example, debates have raged for years about the value of a flight yoke versus a side stick because of the loss of the visual and tactile cues the yoke provides to everyone in the cockpit. Similarly, there are scores of unintended consequences from each design choice and philosophy. Perhaps one salient lesson is to design these cues into the process instead of out of it.
Health care has benefited from incorporating aviation’s quality and safety improvement tactics into our industry, but often only after the long passage of time, many lost lives, and wasted resources. We have an opportunity to head off the worst of automation’s unintended consequences by designing automation in a way that fosters critical thinking skills.