Because "systems thinking" is a difficult concept to describe, I wrote and just posted a paper analyzing a commercial aircraft disaster: the crash of Comair Flight 5191 in Lexington, Kentucky, in August 2006. This is a full-length (30-page) analysis with pictures, diagrams, and source materials; the cockpit voice recorder transcripts are linked below. To my knowledge, the final NTSB findings on the case are not yet out.
It's a little rough around the edges, but it starts with the basic astounded question of how two fully trained pilots, not under pressure, could taxi to and attempt to take off from the wrong runway, resulting in the death of everyone on board except the pilot flying, who was pulled from the flaming wreckage by a first responder. The runway was a few hundred yards too short for the plane to get off the ground safely.
So, it goes from "How on earth could this have happened!?" to "Oh... there but for the grace of God go I." Only the new commercial pilots on the pilot chat boards couldn't imagine how such a thing could ever happen to them. It brought to mind the old saying, "There are old pilots, and there are bold pilots, but there are no old, bold pilots." In this case, however, the rest of the world conspired to set the stage.
As with "errors" in hospitals, it typically takes a whole team of people aligning their actions the wrong way (the "Swiss cheese model"): someone buys the gun, someone loads it, someone cocks the hammer, someone hands it to the poor last guy in the chain, and that guy pulls the trigger. For legal purposes, blame is assessed one way, a way this paper does not assess. For purposes of safety engineering, and for seeing where interventions might help keep this from ever happening again, we need to look at a whole different set of factors that set the stage for this "accident".
Please contact me if you'd like to use this paper (or a newer, better version) for instructional material. Thanks!
(Note: I am a private pilot, but I'm not a member of the NTSB or any official agency, and this analysis is a personal analysis for instructional purposes in safety engineering, not intended for legal purposes. I have no relationship that I know of to anyone involved in this case. These are all real, living people and my reconstruction may be entirely wrong. The point is to honor those who died by learning everything we can from their deaths so this won't happen again.)
Prior Posts:
Comair 5191 - Confirmation Bias and Framing (1/20/07)
Cockpit voice recorder transcripts
Washington DC Crash of Air Florida was 25 years ago - remembered
(with links to BMJ, High-reliability engineering, TEM, etc.)
1 comment:
This Time Aviation Safety Needs to Learn a Lesson From Healthcare Safety
Congratulations to Wade for putting together his root cause analysis, which puts the Comair crash into much better perspective from a systems view than the NTSB report does.
For years, the patient safety movement has held up aviation as an example of how a high-risk industry can improve safety. Aviation’s confidential incident reporting system led to a rich database of “near-misses” and actual accidents that are analyzed and used to prevent accidents. Concepts of team training and simulation techniques that many healthcare organizations use today come from cockpit resource management (CRM) pioneered in aviation. Use of checklists, alerts and reminders, “hearback”, and even the “timeout” we now utilize to ensure correct site surgery all have roots in aviation safety. Transportation accident investigations have taught us how to look for root causes and contributing factors in medical incidents.
However, this time aviation safety needs to learn a lesson from healthcare safety.
The NTSB analysis of the crash was very light on the systems issues that contributed to this unfortunate occurrence and its recommendations were few, though at least one NTSB board member added her own concerns about more systemic issues.
As in most serious incidents, in healthcare, aviation, or any other industry, the confluence of several conditions and events and a cascade of errors bypassed the numerous defenses built into the system to prevent an accident. The NTSB recommendations address many of those issues and may be important in preventing similar accidents in the future. But one cannot help wondering why they left out the most significant solution. The bottom line in this case: the short runway is still there. This was not the first time a plane took off from the wrong runway, nor is it likely to be the last.
In safety systems we assume that many things may go wrong even if they are statistically unlikely to go wrong. The best solutions are forcing functions (systems that force one to do the correct thing) or constraints, which prevent someone from doing the wrong thing. In healthcare (where we've unfortunately had to learn from accidents), we now use special connectors that prevent oxygen lines from being inadvertently hooked up to other gas lines, or connectors that prevent a feeding tube from being inadvertently hooked up to an IV line. We also remove vials of potentially fatal medications from floor stock so a nurse or doctor cannot inadvertently administer a fatal dose under stressful circumstances.
Even if fully barricading the entrance to that short runway is not feasible, there must be ways to make it physically impossible for a big airplane to get on a runway intended only for small airplanes. The commercial planes that need the long runway are undoubtedly bigger, heavier, taller, and have wider wingspans. One can certainly think of a number of physical constraints that would allow the small general aviation planes access to the short runway, yet preclude a large commercial airliner from getting on that wrong runway. Just as we try to avoid putting a doctor or nurse and their patients in one of these dangerous situations, why would we ever put pilots and their passengers in a situation where they could mistakenly end up on the wrong runway? Hopefully, aviation safety can use a lesson from healthcare this time.
Bradley T. Truax, M.D.
Lewiston, NY
The Truax Group
Healthcare Consulting
www.patientsafetysolutions.com