It's a year since Katrina made it obvious that people watching CNN knew more about what was going on than top government officials did.
We have to ask how that is even possible. It defies our intuition, although not our experience, which is interesting.
While the "blame-game" remains in high-gear, Systems Thinking leads us to discount the obvious "bad people" and look for deeper root-causes in the social structure. FEMA Director Brown has been replaced, but the systems problems are harder to see and may still be there.
How would we know?
Some systems features come with the territory, such as problems getting coherent action across 6 or more layers of a hierarchical structure. Each layer has its own intrinsic variables and a world view that is quite distinct from that of the layers above or below. The result is that communication across levels that appears easy is actually quite hard, although the miscommunications may be hard to detect locally. The same words play into different mental models of the world, and convey different meanings.
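To put a rough number on it, here's a minimal sketch (a toy model of my own, not a measurement): assume each layer re-encodes a message into its own mental model with some per-layer fidelity, and that the distortions compound independently. The fidelity figures are illustrative assumptions only.

```python
# Toy model (illustrative assumptions, not data): if each of n hierarchy
# layers independently re-encodes a message into its own mental model
# with fidelity p, the end-to-end fidelity decays as p**n.

def end_to_end_fidelity(p_per_layer: float, layers: int) -> float:
    """Probability a message survives every layer intact."""
    return p_per_layer ** layers

for layers in (2, 6, 10):
    for p in (0.95, 0.80):
        print(f"{layers:2d} layers at {p:.0%} per-layer fidelity: "
              f"{end_to_end_fidelity(p, layers):.0%} end-to-end")
```

Even at 95% fidelity per layer, only about three-quarters of the meaning survives a six-layer trip, and the people at each end have no local way to notice what was lost.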
This is not a problem that is fixed by simply getting everyone radios with compatible frequencies. An in-depth discussion of this problem can be found on the weblog Fifteen Charlie.
Or, on the health care front, this type of problem is not resolved by everyone agreeing to format their messages to the same governmental standard, such as HL7, so that they are "interoperable." The telephone was already "compatible" in that manner, but it didn't help New Orleans. To change the outcomes, we need to realize that there are no "technical problems", only socio-technical problems, and the "socio-" part cannot be a last-minute, optional add-on feature.
So, from President Bush's point of view, policies were followed, money was launched, the work was done. "Brownie, you're doing a heck of a job..." Six to ten levels away, where the money or benefits or even rescue from rooftops was not arriving, these actions looked feeble and inept, disconnected from reality. And today, a year later, much of New Orleans remains as it was right after the storm, although many of the funds have now been fully expended.
John Gall, a University of Michigan emeritus physician, captures the essence of, as he puts it, "how systems really work and how they fail" in his marvelous book Systemantics (1986). An introduction to the book can be found on Wikipedia, along with some of its key rules, such as "A system is no better than its sensory organs" and "To those within a system, outside reality tends to pale and disappear." He goes on to describe the inversion of "input" and "output" and gives this example:
"A Giant program to Conquer Cancer is begun. At the end of five years, cancer has not been conquered, but one thousand research papers have been published. In addition, one million copies of a pamphlet entitled "you and the War against Cancer" have been distributed. Those publications will absolutely be regarded as Output rather than Input. "
His book is a real gem, an easy read, and worth re-reading at least once a month.
Meanwhile, New Orleans remains a visible and tragic reminder of what an open-loop, top-down control model produces in practice. Without sensory feedback making the return journey from the eye to the brain, the hand is as likely to end up in the flame as on the handle of the frying pan. I'd say this "cybernetics 101" property of a control loop is what Stephen Covey, in The Seven Habits of Highly Effective People, calls a principle: a law of nature that you can like or dislike, but can't get around.
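Here's a toy illustration of that cybernetics 101 point (my sketch, not Covey's or Gall's): steering toward a target with and without sensory feedback. The plan is built from a slightly wrong initial estimate, and only the closed loop can notice and correct the drift. The gain and the size of the initial error are illustrative assumptions.

```python
# Open-loop vs. closed-loop control, in miniature. All numbers are
# illustrative assumptions chosen to make the contrast visible.

TARGET = 10.0     # where the pan handle actually is
ESTIMATE = 11.5   # where headquarters believes it is
STEPS = 20
GAIN = 0.3        # proportional feedback gain

# Open loop: execute the pre-computed plan, never look at the result.
open_pos = 0.0
planned_step = ESTIMATE / STEPS
for _ in range(STEPS):
    open_pos += planned_step

# Closed loop: measure the remaining error each step and correct.
closed_pos = 0.0
for _ in range(STEPS):
    closed_pos += GAIN * (TARGET - closed_pos)

print(f"open-loop final error:   {abs(TARGET - open_pos):.2f}")    # 1.50
print(f"closed-loop final error: {abs(TARGET - closed_pos):.2f}")  # 0.01
```

The open-loop controller faithfully executes its plan and ends up exactly as wrong as its initial estimate. The closed-loop controller starts with no better information and converges anyway, because the error itself is doing the navigating.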
Unfortunately for all of us, it is precisely when high-stress disasters occur that these upward communication channels close entirely, as New Orleans discovered. Almost every factor there is conspires to close the lines:
* Top brass, fearing blame, close ranks
* Top brass, under stress, fall back on previously successful behaviors of ignoring the small stuff and focusing on the top one or two priority issues. In a huge, multilevel organization, this means every problem from level 3 on down is totally ignored.
* The most important information, that which challenges preconceived notions and the assumptions of the plan in hand, is what is ignored the most at the top. Those at the top are trying to work the plan, not question the plan. Efforts to raise challenging facts are viewed as enemy action, not as helpful feedback from the sensory organs. In the worst cases, the messengers are killed to resolve the conflict between the inputs and the mental model.
I'm working on a white paper on how upward channels shut down during disasters and how that could influence disaster-preparedness competencies. Contact me if you're interested in reviewing it.
And, voila. A president who is unaware of what every CNN viewer knows. Auto companies that can't understand how anyone could have foreseen rising gasoline prices, or competition from China.
These are very strong systems forces that can totally overwhelm huge numbers of very bright and well-intentioned people. These are the types of problems we need to be able to recognize and solve, or they will simply keep on occurring.
How frequent are such problems? Well, if problems occur randomly at all levels, and if humans typically only ever see and fix the non-systems problems, then there will be an ever-growing sludge of unattended systems problems. The percentage of all problems that are systems problems will keep on growing. A good guess, perhaps somewhat waggish, is that, if such problems have never been addressed, they almost certainly dominate the current behavior of the organization in question. The longer the organization has been functioning, and the larger it is, the larger the percentage of its problems that will be unrecognized and unresolved systems problems. The US government is probably near the limit point: probably over 99% of its problems are systems problems, as all the others have been fixed.
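A back-of-the-envelope sketch of that "growing sludge" argument, under assumptions of my own choosing: 100 new problems a year, 20% of them systems problems, ordinary problems fixed within the year, systems problems never fixed. The share of currently visible problems that are systems problems then climbs toward 100%.

```python
# Illustrative assumptions only: these rates are made up to show the
# shape of the curve, not to describe any real organization.

NEW_PER_YEAR = 100
SYSTEMS_SHARE = 0.20

systems_backlog = 0.0
for year in range(1, 101):
    systems_backlog += NEW_PER_YEAR * SYSTEMS_SHARE       # never fixed
    ordinary_now = NEW_PER_YEAR * (1 - SYSTEMS_SHARE)     # fixed within the year
    if year in (1, 5, 20, 100):
        share = systems_backlog / (systems_backlog + ordinary_now)
        print(f"year {year:3d}: {share:.0%} of outstanding problems "
              f"are systems problems")
```

Starting at 20% in year one, the systems share passes half by year five and sits at 96% by year one hundred. The exact numbers don't matter; the one-way ratchet does.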
The fact that we don't even recognize these as problems is what Systems Thinking attempts to address. Put simply, our "systems problem" detectors are broken.
Take the analogy of the detection of pulsars: intensely bright flashing objects in the radio-frequency spectrum, virtual strobe lights in the night sky to a radio telescope, outshone only by the sun and the galactic center. These were missed entirely for years, because "everyone knew" that there were no important signals at high frequencies, that this was just noise, and so the noise was filtered out before any analysis of the sensory input was done.
It took a female graduate student, Jocelyn Bell Burnell, one not caught up in the shared myth, to challenge that assumption, remove the filter, and just look at what was there with open eyes.
Systems problems are similar. They are everywhere around us, but we use statistics based on Sir R. A. Fisher's work and the General Linear Model, or even multilevel models that are still linear (as Dr. Ana Diez-Roux at the University of Michigan points out in the Annual Review of Public Health, 2000, Volume 21, pages 171-192), and those models assume, at their core, that there is no feedback. The key assumption, generally unspoken and often unrealized, is that there is a causal end of "independent variables", some set of paths, and a terminal end of "dependent variables." Feedback, or reciprocal causality, is often noted in passing, but, lacking a recognized way to cope with it, most public health papers then try to proceed without it. Or, since the feedback is "small", it is considered insignificant - a mistake similar to looking at a beaker of air and denying the possibility of "hurricanes", which rely critically on just such tiny effects, almost infinitely compounded, to exist and grow.
Feedback from the output back into the input invalidates the model, and, of course, almost every social system we care about has it: love, war, communication, relationships, terrorism ("He hit me back first!"), the economy, the stock market, the housing market, etc.
So, we don't see such "distal causality", not because it's not there, but because we've short-circuited it out of the equations before we even turn on the computer.
I'd suggest it's time for someone to remove that filter, analyze how to do statistics on feedback-dominated regulatory control loops, and let us see what's really out there. Odds are, as with the night sky, we will be very surprised by the answer.
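To make the point concrete, here's a quick numerical sketch of my own construction: data generated by a simple two-equation feedback loop, then analyzed with ordinary least squares as if the feedback weren't there. The coefficients are arbitrary illustrative choices; the fitted slope lands well away from the true effect because the feedback makes the "independent" variable correlated with the disturbance.

```python
# True model (coefficients are illustrative assumptions):
#   y = 0.5*x + u   (the causal effect we want to measure)
#   x = 0.8*y + v   (simultaneous feedback from y back into x)
# OLS of y on x, blind to the second equation, overestimates the effect.

import random

random.seed(0)
B_TRUE, C_FEEDBACK = 0.5, 0.8

xs, ys = [], []
for _ in range(100_000):
    u, v = random.gauss(0, 1), random.gauss(0, 1)
    # Solve the two simultaneous equations for x, then y.
    x = (C_FEEDBACK * u + v) / (1 - B_TRUE * C_FEEDBACK)
    y = B_TRUE * x + u
    xs.append(x)
    ys.append(y)

mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"true effect: {B_TRUE}, OLS estimate: {slope:.2f}")  # ~0.79, not 0.5
```

The regression isn't wrong about the data; it's wrong about the world, because the no-feedback assumption was baked in before the first number was crunched. That's the filter I'd like to see removed.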
========
In his new book The Eighth Habit: From Effectiveness to Greatness, Stephen Covey separates out and focuses on problems, including organizational blindness, that result from attempting to use the old paradigm, the industrial machine model, instead of the new paradigm, the Knowledge Worker model.
In the old model, workers are treated like machines - replaceable, better without an independent mind or spirit, needing firm management or a good whip hand to keep them from goofing off. In particular, only those in positions of authority should take initiative and decide what should be done. The model creates a self-fulfilling world.
The alternative he presents is the empowered knowledge worker, who has initiative, a "voice", heart, spirit, and an active role, including taking personal responsibility for seeing that the job gets done, and done well. This expectation also creates a self-fulfilling world that latches, but in a far more productive state, one that requires far less day-to-day management of "bad employees".