
Monday, October 20, 2008

The illusion of consensus on the deficit

image: from http://www.moillusions.com/
The two vertical red bars are the same height on the screen if you measure them with a ruler. "All" you have to do is ignore the subway walls and just look at the two red bars (or get a ruler, or move to the side and look across the screen). Some illusions are so powerful they work even when you know they are working.

========================

The cartoon figure Dennis the Menace once wondered "How come dumb stuff seems so smart when you're doing it?"

It's a very insightful question we should not rush by.

I think what's most missing here is a popular understanding of the power of fear and desire to distort one's thinking.

There are three errors related to that most popular human activity, yielding to temptation.

The first is the incredible power of desire to overcome reason and twist perception so that the reasons for doing what you want to do anyway seem solid and strong, and the reasons against it seem distant and weak.

The second is the remarkable ability of people to be unaware of the difference between how things look from the inside and how they look from the outside. People condemn home-buyers and banks for going far too deeply into debt and, in the same breath, suggest with a straight face that the solution is "obviously" for the country to go much further into debt.

There is zero realization that the sin they accuse the bankers of looked exactly the same to those bankers then as this "consensus" for going a few trillion dollars further into debt looks to politicians today. And the actions that make so much sense today will someday look as unfathomable as the home-buyers' and hedge funds' actions look to us now.

"How could they have been so stupid?" It's worth understanding exactly how they could have been so stupid, and why very bright people end up doing very dumb things.

And the third is the remarkable power of group-think to solidify an opinion in a closed room and decide that those who have a different opinion are enemies of all that is right and decent, again obviously. And, as everyone knows, once everyone around you is sinning, it is much harder not to fall in line with them yourself, especially if you wanted to all along.

Again, rationality trails along behind, making up and changing justifications on the fly to make the choice look sane and rational, and even fair and balanced.

The prompt for this post was the following:
==========================================


The New York Times has an article this morning:

Deficit Rises, and the Consensus Is to Let It Grow
by Louis Uchitelle and Robert Pear
Excerpt:
Like water rushing over a river’s banks, the federal government’s rapidly mounting expenses are overwhelming the federal budget and increasing an already swollen deficit.

and

But the extra spending, a sore point in normal times, has been widely accepted on both sides of the political aisle as necessary to salvage the banking system and avert another Great Depression.

“Right now would not be the time to balance the budget,” said Maya MacGuineas, president of the Committee for a Responsible Federal Budget, a bipartisan Washington group that normally pushes the opposite message.

Confronted with a hugely expensive economic crisis, Democratic and Republican lawmakers alike have elected to pay the bill mainly by borrowing money rather than cutting spending or raising taxes.
First, I noted that the vast majority of the comments on this article were very negative, so, like the bailout itself, it seems the consensus in Washington flies in the face of the consensus on Main Street.

I did comment myself, as follows:
The cartoon figure Dennis the Menace once wondered "How come dumb stuff seems so smart when you're doing it?"

Teenagers with their first credit card, families with their first great deal on a mortgage, hedge funds and even conservative banks with their soaring debt, all are so swayed by the temptation that they forget the bills will come due some day.

Regardless of the consensus on the issue, I would suggest that letting the debt out of the bag is less "river water over the banks" and more "water over-topping the earthen levee". God help us all.
and, later,

If more debt is acceptable, why not just borrow $3 trillion and give everyone $10,000?

I think that's a reasonable alternative to compare any other borrow-and-spend scheme to for pros and cons.
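(As a quick sanity check on that arithmetic, here is a back-of-envelope sketch, assuming the roughly 300 million U.S. population of 2008; the figure is approximate and used only for illustration.)

```python
# Back-of-envelope check: $3 trillion spread evenly over ~300 million people.
# The population figure is approximate (circa 2008), for illustration only.
total_borrowed = 3_000_000_000_000   # $3 trillion
population = 300_000_000             # ~300 million people

print(f"${total_borrowed / population:,.0f} per person")   # -> $10,000 per person
```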

Friday, August 31, 2007

Model-induced blindness, FEMA, and Systemantics

[ Published in my other weblog 8/31/06. Still relevant today ]

It's a year since Katrina made it obvious that people watching CNN knew more about what was going on than top government officials did.

We have to ask how that is even possible. It defies our intuition, although not our experience, which is interesting.

While the "blame-game" remains in high-gear, Systems Thinking leads us to discount the obvious "bad people" and look for deeper root-causes in the social structure. FEMA Director Brown has been replaced, but the systems problems are harder to see and may still be there.

How would we know?

Some systems features come with the territory, such as the difficulty of getting coherent action across six or more layers of a hierarchical structure. Each layer has its own intrinsic variables and a world view that is quite distinct from that of the layers above or below. The result is that communication across levels appears easy but is actually quite hard, although the miscommunications may be hard to detect locally. The same words play into different mental models of the world, and convey different meanings.

This is not a problem that is fixed by simply getting everyone radios with compatible frequencies. An in-depth discussion of this problem can be found on the weblog Fifteen Charlie.

Or, on the health care front, this type of problem is not resolved by everyone agreeing to use messages all formatted to the same governmental standard, such as HL7, so they are "interoperable." The telephone was already "compatible" in that manner, but it didn't help New Orleans. To change the outcomes, we need to realize that there are no "technical problems", only socio-technical problems, and the "socio-" part cannot be a last-minute add-on optional feature.

So, from President Bush's point of view, policies were followed, money was launched, and the work was done. "Brownie, you're doing a heck of a job..." Six to ten levels away, where the money or benefits or even rescue from rooftops was not underway, those same actions looked feeble and inept, disconnected from reality. And today, a year later, much of New Orleans still remains as it was, although many of the funds have now been fully expended.

John Gall, a University of Michigan emeritus physician, in his marvelous book Systemantics (1986), captures what he calls the essence of "how systems really work and how they fail." An introduction to the book can be found on Wikipedia, along with some of the key rules it reveals, such as "A system is no better than its sensory organs" and "To those within a system, outside reality tends to pale and disappear." He goes on to describe the inversion of "input" and "output" and gives this example:

"A Giant program to Conquer Cancer is begun. At the end of five years, cancer has not been conquered, but one thousand research papers have been published. In addition, one million copies of a pamphlet entitled "you and the War against Cancer" have been distributed. Those publications will absolutely be regarded as Output rather than Input. "

His book is a real gem, an easy read, and worth re-reading at least once a month.

Meanwhile, New Orleans remains a visible and tragic reminder of what an open-loop, top-down control model produces in practice. Without sensory feedback making the return journey from the eye to the brain, the hand is as likely to end up in the flame as on the handle of the frying pan. I'd say this "cybernetics 101" property of a control loop is what Stephen Covey, in The Seven Habits of Highly Effective People, calls a principle: a law of nature that you can like or dislike, but cannot get around.
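To make that "cybernetics 101" point concrete, here is a minimal sketch; the scenario (a hand reaching for a target while being pushed off course) is a toy I've invented for illustration, not anything taken from Covey or Gall. An open-loop controller executes its plan blindly and ends up nowhere near the target; a controller that senses its error each step corrects for the same disturbance.

```python
# Open-loop vs. closed-loop control, in the simplest possible terms.
# The target, disturbance, and gain are arbitrary illustrative numbers.

TARGET = 10.0          # where the hand should end up
DISTURBANCE = -0.5     # an unplanned push on every step (the "wind")

def open_loop(steps=20):
    """Move a fixed planned amount per step, never checking where we actually are."""
    position = 0.0
    planned_step = TARGET / steps
    for _ in range(steps):
        position += planned_step + DISTURBANCE
    return position

def closed_loop(steps=20, gain=0.5):
    """Each step, sense the remaining error and correct a fraction of it."""
    position = 0.0
    for _ in range(steps):
        error = TARGET - position            # the sensory feedback
        position += gain * error + DISTURBANCE
    return position

print(f"open loop ends at   {open_loop():.2f}  (target {TARGET})")   # ends at 0.00
print(f"closed loop ends at {closed_loop():.2f}  (target {TARGET})") # ends near 9.00
```

The open-loop plan is "correct" on paper, yet the unmeasured disturbance cancels it completely; the closed-loop version never sees the plan fail because it keeps measuring where it actually is.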

Unfortunately for all of us, it is precisely when high-stress disasters occur that these upwards communication channels close up entirely, as New Orleans discovered. Almost every factor there is conspires to close the lines:

* Top brass, fearing blame, close ranks

* Top brass, under stress, fall back on previously successful behaviors of ignoring small stuff and focusing on the top one or two priority issues. In a huge, multilevel organization, this means every problem from level 3 down is totally ignored.

* The most important information, that which challenges preconceived notions and the assumptions of the plan in hand, is what is ignored the most at the top. They are trying to focus on working the plan, not questioning it. Facts that challenge the plan are viewed as enemy action, not as helpful feedback from sensory organs. In the worst cases, the messengers are killed to resolve the conflict between the inputs and the mental model.

I'm working on a white paper on the issue of how upward channels shut down during disasters and how that could influence disaster-preparedness competencies. Contact me if you're interested in reviewing it.

And, voila. A president who is unaware of what every CNN viewer knows. Auto companies that can't understand how anyone could have foreseen rising gasoline prices, or competition from China.

These are very strong systems forces that can totally overwhelm huge numbers of very bright and well-intentioned people. These are the types of problems we need to be able to recognize and solve, or they will simply keep on occurring.

How frequent are such problems? Well, if problems occur randomly at all levels, and if humans typically only ever see and fix the non-systems problems, then there will be an ever-growing sludge of unattended systems problems. The percentage of all problems that are systems problems will keep on growing. A good guess, perhaps somewhat waggish, is that if such problems have never been addressed, then they almost certainly dominate the current behavior of the organization in question. The longer the organization has been functioning, and the larger it is, the larger the percentage of its problems that will be unrecognized and unresolved systems problems. The US government is probably close to the limit point: probably over 99% of its behavior is dominated by such problems, as the others have all been fixed.
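A toy simulation makes the "growing sludge" argument visible. The arrival and fix rates below are invented purely for illustration, not estimates of any real organization: if ordinary problems get fixed and systems problems never do, the unresolved backlog drifts toward being almost entirely systems problems.

```python
import random

# Toy model: each year 100 new problems appear, 20% of them systems problems.
# Ordinary problems are noticed and fixed at a 90% rate per year; systems
# problems are never fixed. Track what the unresolved backlog looks like.
random.seed(0)

SYSTEMS_SHARE_OF_NEW = 0.2
ORDINARY_FIX_RATE = 0.9

ordinary, systems = 0, 0
for year in range(1, 51):
    new_systems = sum(random.random() < SYSTEMS_SHARE_OF_NEW for _ in range(100))
    systems += new_systems
    ordinary += 100 - new_systems
    ordinary = round(ordinary * (1 - ORDINARY_FIX_RATE))   # most ordinary ones get fixed
    if year in (1, 5, 10, 25, 50):
        share = systems / (systems + ordinary)
        print(f"year {year:2d}: {share:.0%} of the unresolved problems are systems problems")
```

Within a few simulated decades the backlog is well over 95% systems problems, even though they were only 20% of what arrived.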

The fact that we don't recognize these as problems is what Systems Thinking attempts to address. Put simply, our "systems problem" detectors are broken.

Take the analogy of the detection of pulsars, intensely bright flashing objects in the radio-frequency spectrum, virtual strobe lights in the night sky to a radio telescope, outshone only by the sun and the galactic center. These were missed entirely for years, because "everyone knew" that there were no important signals at high frequencies, that this was just noise, and the noise was filtered out before any analysis of the sensory input was done.

It took a female graduate student, Jocelyn Bell Burnell, one not caught up in the shared myth, to challenge that assumption, remove the filter, and just look at what was there with open eyes.

Systems problems are similar. They are everywhere around us, but we use statistics based on Sir R. A. Fisher's work and the General Linear Model, or even multilevel models that are still linear (as Dr. Ana Diez-Roux at the University of Michigan points out in the Annual Review of Public Health, 2000, Volume 21, pages 171-192), and that assume, at their core, that there is no feedback. The key assumption, generally unspoken and often unrealized, is that there is a causal end of "independent variables", some set of paths, and a terminal end of "dependent variables." Feedback or reciprocal causality is often noted in passing, but, lacking a recognized way to cope with it, most public health papers then try to proceed without it. Or, since the feedback is "small", it is considered insignificant - a mistake similar to looking at a beaker of air and denying the possibility of "hurricanes", which rely critically on just such tiny effects, almost infinitely compounded, to exist and grow.

So, without meteorological feedback effects, Katrina would never have existed in the first place.
The model is invalidated whenever the output feeds back into the input, which, of course, is exactly what almost every social system we care about does: love, war, communication, relationships, terrorism ("He hit me back first!"), the economy, the stock market, the housing market, and so on.

So, we don't see such "distal causality", not because it's not there, but because we've short-circuited it out of the equations before we even turn on the computer.
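Here is a minimal numerical sketch of that "beaker of air versus hurricane" point; the coefficients are arbitrary, chosen only to show the shape of the effect. A feedback term of 1% per step is invisible in any single step, yet compounded over a thousand steps it, and not the "independent variable," dominates the outcome.

```python
# Compare a driven system with a tiny (1%) feedback term against the same
# system with the feedback term dropped, as a purely linear model would do.
# All numbers are arbitrary; the point is the compounding, not the units.
FEEDBACK = 0.01        # each step, 1% of the current output feeds back in
EXTERNAL_INPUT = 1.0   # the "independent variable" pushed in every step
STEPS = 1000

with_feedback = 0.0
without_feedback = 0.0
for _ in range(STEPS):
    with_feedback += EXTERNAL_INPUT + FEEDBACK * with_feedback
    without_feedback += EXTERNAL_INPUT   # the no-feedback picture

print(f"no feedback:   {without_feedback:,.0f}")   # 1,000
print(f"with feedback: {with_feedback:,.0f}")      # roughly 2,100,000
```

Drop the "small" feedback term and the model predicts a thousand; keep it and the answer is three orders of magnitude larger.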

I'd suggest it's time for someone to remove that filter, analyze how to do statistics on feedback-dominated regulatory control loops, and let us see what's really out there. Odds are, as with the night sky, we will be very surprised by the answer.


========

In his new book The 8th Habit: From Effectiveness to Greatness, Stephen Covey separates out and focuses on problems, including organizational blindness, that result from attempting to use the old paradigm, the industrial machine model, instead of the new paradigm, the Knowledge Worker model.

In the old model, workers are treated like machines - replaceable, better without an independent mind or spirit, needing firm management or a good whip hand to keep them from goofing off. In particular, only those in positions of authority should take initiative and decide what should be done. The model creates a self-fulfilling world.

The alternative he presents is the empowered knowledge worker, who has initiative, a "voice", heart, spirit, and an active role, including taking personal responsibility for seeing that the job gets done, and done well. This expectation also creates a self-fulfilling world that latches, but in a far more productive state, and one that requires far less day-to-day management of "bad employees".