Here are some additional references that supplement today's Washington Post article on the changes in aviation and hospital safety since that day 25 years ago tomorrow, when Air Florida Flight 90 failed a slushy takeoff and crashed into the Potomac.
The article "A Crash's Improbable Impact" by Del Quentin Wilber describes the shift away from the rugged-individual model of cockpits and operating rooms toward a much safer model, one that emphasizes less ego and more listening to what the rest of the staff are trying to say.
But some experts believe it took the spectacular crash of Air Florida in the Potomac to drill the lessons home and spur widespread use of what was then a revolutionary training regime, later to be known as Crew Resource Management.
Soon, airlines were teaching the Air Florida crash as a textbook example of what can go wrong when pilots do not communicate and listen properly.
In fact, Crew Resource Management has since evolved into a more formal science, the University of Texas's "Threat and Error Management" model, described in this British Medical Journal (BMJ) paper and this BMJ slide show, which I referred to in prior posts analyzing the crash of Comair Flight 5191 in Kentucky last August.
Pilot John Nance's National Patient Safety Foundation deserves mention as well, for bringing aviation safety lessons into hospitals systematically. And University of Michigan Professor Karl Weick's work on organizational "mindfulness" is required reading on how things go wrong and why. (One example, his wildfire safety research, is linked here and here.) Weick has also written extensively about organizational "sensemaking under pressure".
Serious students of High Reliability Organizations maintain entire web sites with a deep literature on what actually works, and what doesn't, to make complex operations safer for us all, along with some in-depth white papers on the Web, such as MIT's John Carroll's "Organizational Learning From Experience in High-Hazard Industries: Problem Investigation as Off-Line Reflective Practice", which explores what can go wrong and how to stop it. (I wrote a very short summary of that here.)
One very serious learning organization that deals with high-stakes life-and-death situations daily isn't the hospitals, but the US Army. Those who doubt that listening to subordinates (management's "Theory Y") can work in practice, and remain compatible with maintaining hierarchical control and mission focus, should read the US Army's Leadership Field Manual (FM 22-100) and see how the Army has merged the two successfully into a single command and leadership doctrine.
Back at the hospital level, these techniques are now being taught in "Patient Safety" courses at schools such as Johns Hopkins by researchers such as Albert Wu, Laura Morlock, and Peter Pronovost, who won last year's JCAHO Eisenberg award for his work in ICU safety. This is downstream of the Institute of Medicine's "To Err Is Human" study, which estimated that almost 100,000 patients were dying each year from avoidable mistakes in hospitals, most of them arising from failures of communication.
One of the reasons that "many counselors" are required for understanding and victory is that complex adaptive systems defy our intuition and often produce unexpected side effects that humans tend to resist dealing with. This is clearly documented by Jay Forrester in his classic testimony to Congress, referenced in my prior post here.
The trick, in short, is that commanders need to build trust with their men and women between high-speed missions if they expect their troops to listen and follow them without hesitation when it matters most. The flip side is that the troops at the front are very often more in touch with reality than their commanders, so a pathway upward for discordant or discrepant information has to be provided to update the commander's mental model of what's going on.
A third reason for a whole-body learning organization is that sometimes the pilot in command simply loses it, and that shouldn't cost the mission. If you read the transcript of Air Florida Flight 90's last minutes, the part not mentioned in the article, the pilot appears to have moved 60 seconds ahead in time, already in the river: he simply repeats, over and over, "It's so cold," as the plane careens to its death.
technorati tags:safety, aviation, reliability, accident, mindfulness, Weick, Morlock, Pronovost, Forrester, Army, leadership