(Columbia shuttle launch / NASA)
"The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds."
John Maynard Keynes
To understand how we "see things", we need to realize that vision is not at all some kind of biological TV camera that simply projects its image where we can view it carefully and without bias. The picture that forms has been so filtered, edited, and amended as to sometimes bear little relationship at all to what is before us. Our hopes, fears, mental models, stereotypes and prejudices intervene long before the image delivered to us has been formed – as surely as a political candidate's own words have been replaced by many layers of handlers. And, worse, the intervention is itself as invisible to us, and as hard to see, as our eyes' own "blind spots" – which are effectively papered over with an extrapolation of the surroundings so that we are not burdened (or informed) by what is there.
In our evolution it was valuable to be able to discard the ten thousand leaves and, based solely on a little patch showing through here and there, to connect the dots and perceive the dangerous animal behind them – and to do so with sufficient certainty that we would take immediate defensive action, even if it sometimes meant over-reacting to shadows. The process is built into our hardware and is automatic and invisible. It is accelerated if everyone else around us is screaming and running – then we too see the beast, real or not.
Two features of our visual system contribute greatly to disagreements between humans about what is "obviously going on".
One is an automatic "zoom", which brings whatever we are contemplating, at whatever scale, to just fill our mental TV screen. Whether it is tying our shoe-lace or contemplating global thermonuclear war, the subject occupies exactly one mental screen.
A second feature, rooted in our need to survive, is the way our eyes cause anything constant to fade from view, literally, so that we can quickly detect anything that is moving, changing, or different.
These two features combine to make it startlingly easy to take some small disagreement between two people and have each person “blow it all out of proportion” and lose track entirely of how much in common they have, and all the good things they share. After cooling down, each wonders how that could possibly have occurred. This is a perfect example of a problem actually caused by the “features” of our visual system.
Another problem is the astounding impact of context on how “the exact same data” is seen on our mental TV screen.
Here's one example, in which you should simply ignore the background and note that the two vertical red bars are exactly the same height. It is extremely hard to do, even after you print out the image, measure them, and confirm it.
Below is an even stronger illusion.
The dark gray square at the top was made by simply cutting out a section of the "light" gray square in the "shadow", and pasting it up in the white background area.
Your eyes "auto-correct" it for you to account for the "shadow." You can’t stop them from doing this. I have yet to find anyone who can easily “see” that the two squares marked are the same shade of gray, even when they have confirmed that they are.
I know this seems hard to believe, so do this: print out the picture, get a pair of scissors, and cut out the square in the shadow and slide it over to the edge, where it magically "changes color" and becomes dark. As you slide it across the "shadow", the same square changes shade right in front of you.
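If you would rather not reach for scissors, a few lines of code make the same point. This is a minimal sketch using Python's Pillow library; the file name and pixel coordinates are hypothetical placeholders that you would replace with the actual image file and the centers of the two marked squares.

```python
# A minimal sketch, assuming the illusion image is saved locally.
# "checker_shadow.png" and the coordinates below are hypothetical
# placeholders; point them at the real file and the centers of the
# two marked squares.
from PIL import Image

img = Image.open("checker_shadow.png").convert("RGB")

square_in_shadow = img.getpixel((280, 300))  # square inside the "shadow"
square_in_light = img.getpixel((280, 120))   # square on the white background

print("in shadow:", square_in_shadow)
print("in light: ", square_in_light)
print("same shade:", square_in_shadow == square_in_light)
```

The two printed values come out identical, even while your eyes refuse to agree.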
This is just one of the thousands of things your perceptual system is doing to be "helpful" to you, including altering the way you perceive people around you, so that they fit your mental model of how things "should" be.
The same effect is at work if you're deep into depression, when your mind is "helpfully" coloring everything around you "depressing" before it shows it to you.
That's what makes prejudice or bias or depression so hard to detect and treat - they seem so "obvious" and "external" that you can't figure out that your eyes changed reality before they showed it to you. This realization that your mind can lie, convincingly, to you, is the first step in Cognitive Behavioral Therapy and overcoming depression.
So, our minds and eyes can be gripped with not just an image, but an attitude or mental model that is almost alive, that filters and twists and selects and changes everything around us to fit its own view and thereby survive. It fights back against our inroads, undoing our progress. No wonder earlier humans thought they had become “possessed” by a demon.
This, sadly, is not just something that afflicted ancient man and that we, being modern, have outgrown. We have the same bodies and visual systems that ancient man had, with all their pros and cons.
In modern terms, we are captive to mental models and feedback loops. The famous economist John Maynard Keynes observed the same thing (quoted at http://en.wikiquote.org/wiki/John_Maynard_Keynes ):
The General Theory of Employment, Interest and Money (1935)
- "The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds." (Preface)
- "The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist." (Ch. 24, "Concluding Notes")
Sadly, we have not even exhausted the features of human perception that control us invisibly, intervening before we can see what they have done.
Charles Schulz's cartoon character Snoopy, lying atop his doghouse one night, captured it perfectly as he mused:
Did you ever notice
That if you think about something at 2 AM
And then again at noon the next day
You get two different answers?
An equivalent morsel of wisdom from the "Dennis the Menace" cartoon is this thought, as Dennis sits in the corner being punished for his latest mischief:
“How come dumb stuff seems so smart when you’re doing it?”
For better or worse, we are all caught up in an invisible current created by those around and near us, especially our peers. The resulting "group think" can often lead us all to the same wrong conclusion at once, and then latch that thought in place where none of us can escape "seeing" it as "obvious".
This might not be so bad, but if we simultaneously interpret those who disagree as “enemies, out to destroy us”, we have a serious problem.
In any case, as we have all experienced, it is far easier to fall into mischief or sin or wrong ideas if the entire herd around us has already fallen into it.
This impact is remarkably strong, and well known to magicians. If only one person in an audience sees through your trick but no one else near them sees it, they will tend, strongly, to actually “un-see” what they “thought they saw” to reduce the discord.
Because all these effects take place before the images reach your mental TV screen, you can try all you want to be “unbiased” after that, with no impact. And usually, if charged with being biased or prejudiced, people react with anger and outrage, because they are trying to “be careful.” Sadly, they are carefully reasoning with distorted information.
One professor I had in Business School was involved in the design of the Pentagon's War Room. He noted that, by the time the billions of pieces of information had been processed, filtered, summarized, tweaked, and massaged to make them fit in a one-page summary, the conclusion was already built in by the system. Anyone would draw the same conclusion, wrong or right, viewing that information. The War Room or central headquarters concept has a fatal flaw that way. How, for example, could General Motors executives not realize that people would switch to smaller cars when their financial pain rose? From the outside, it seems incredible.
Corporations and large organizations have a worse problem, one that so far no one besides me seems to have noticed: what small facts or "dots" add up to, and how they connect, depends on the scale at which you are operating, not just on where you stand.
Here’s one of the classic pictures that illustrate the problem. View this image from normal viewing range, and then stand up, walk across the room, turn and look again.
The image above is from the 31 March 2007 issue of New Scientist, and comes from a paper entitled 'Hybrid Images'.
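For the curious, here is roughly how a hybrid image of this kind can be built: blur one picture to keep only its coarse, low-spatial-frequency structure, subtract a blur from the other to keep only its fine detail, and add the two. This is a minimal sketch, not the paper's actual code; the file names are hypothetical and the blur width (sigma) is an arbitrary tuning choice.

```python
# A minimal hybrid-image sketch, assuming two same-sized grayscale
# images on disk. File names and sigma are hypothetical choices.
import numpy as np
from scipy.ndimage import gaussian_filter
from PIL import Image

def load_gray(path):
    return np.asarray(Image.open(path).convert("L"), dtype=float)

far_face = load_gray("face_a.png")    # the face you see from across the room
near_face = load_gray("face_b.png")   # the face you see up close

low_pass = gaussian_filter(far_face, sigma=8)                # coarse structure only
high_pass = near_face - gaussian_filter(near_face, sigma=8)  # fine detail only

hybrid = np.clip(low_pass + high_pass, 0, 255).astype(np.uint8)
Image.fromarray(hybrid).save("hybrid.png")
```

From close up the fine detail dominates; from across the room only the blurred structure survives, so the same picture appears to change its content with viewing distance.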
http://www.yoism.org/?q=node/141 has many more such images and illusions, as well as this delightful picture: "Things we have to believe to see"
Why men don't ask for directions
PISA/OECD - Why our education stresses the wrong way of seeing
Failure is perhaps our most taboo subject (link to John Gall Systemantics)
Active strength through emergent synthesis
US - Economy of arrogance (and blindness)
Virtue drives the bottom line - secrets of high-reliability systems
High-Reliability Organizations and asking for help
Secrets of High-Reliability Organizations (in-depth academic paper)
High-Reliability.org web site
Threat and Error Management - aviation and hospital safety
Houston - we have another problem (on complexity and limits of one person's mind)
Institute of Medicine - Crossing the Quality Chasm and microsystems (small group teamwork)
Here are a few quotations from MIT Professor John Sterman's textbook "Business Dynamics".

Many advocate the development of systems thinking - the ability to see the world as a complex system, in which we understand that "you can't just do one thing" and that "everything is connected to everything else." (p4)
Such learning is difficult and rare because a variety of structural impediments thwart the feedback processes required for learning to be successful. (p5)
Quoting Lewis Thomas (1974):
When you are confronted by any complex social system, such as an urban center or a hamster, with things about it that you're dissatisfied with and anxious to fix, you cannot just step in and set about fixing things with much hope of helping. This realization is one of the sore discouragements of our century.... You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn't counted on in other, remote parts. If you want to fix something you are first obligated to understand ... the whole system ... Intervening is a way of causing trouble.
In reality there are no side effects, there are just effects.
Unanticipated side effects arise because we too often act as if cause and effect were always closely linked in time and space. (p 11)
Most of us do not appreciate the ubiquity and invisibility of mental models, instead believing naively that our senses reveal the world as it is (p16).
The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run static view of the world with a holistic, broad, long-term dynamic view and then redesign our processes and institutions accordingly. (p18)
Quoting Nobel Prize winner Herbert Simon (p26): The capacity of the human mind for formulating and solving complex problems is very small compared with the size of the problem...
These studies led me to suggest that the observed dysfunction in dynamically complex settings arises from mis-perceptions of feedback. The mental models people use to guide their decisions are dynamically deficient. As discussed above, people generally adopt an event-based, open-loop view of causality, ignore feedback processes, fail to appreciate time delays between action and response in the reporting of information, ... (p27)
Further, the experiments show the misperceptions of feedback are robust to experience, financial incentives, and the presence of market institutions... First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. (p27)
People tend to think in single-strand causal series and have difficulty in systems with side effects and multiple causal pathways (much less feedback loops). (p28)
A fundamental principle of system dynamics states that the structure of the system gives rise to its behavior. However, people have a strong tendency to ... "blame the person rather than the system". We ... lose sight of how the structure of the system shaped our choices ... [which] diverts our attention from ... points where redesigning the system or governing policy can have a significant, sustained, beneficial effect on performance (Forrester 1969). (p29)
People cannot simulate mentally even the simplest possible feedback system, the first order linear positive feedback loop. (p29) Using more data points or graphing the data did not help, and mathematical training did not improve performance. (p29) People suffer from overconfidence ... wishful thinking ... and the illusion of control... Memory is distorted by hindsight, the availability and salience of examples, and the desirability of outcomes. (A minimal simulation of that first loop appears after these quotations.)
The research convincingly shows that scientists and professionals, not only "ordinary" people, suffer from many of these judgmental biases. (p30). Experiments show the tendency to seek confirmation is robust in the face of training in logic, mathematics, and statistics. (p31).
We avoid publicly testing our hypotheses and beliefs and avoid threatening issues. Above all, defensive behavior involves covering up the defensiveness and making these issues undiscussable, even when all parties are aware they exist. (p32).
Defensive routines often yield group-think where members of a group mutually reinforce their current beliefs, suppress dissent, and seal themselves off from those with different views or possible disconfirming evidence. Defensive routines ensure that the mental models of team members remain ill formed, ambiguous, and hidden. Thus learning by groups can suffer even beyond the impediments to individual learning. (p33).
Virtual worlds are the only practical way to experience catastrophe in advance of the real thing. In an afternoon, one can gain years of simulated experience. (p35).
The use of virtual worlds in managerial tasks, where the simulation compresses into minutes or hours dynamics extending over years or decades, is more recent and less widely adopted. Yet these are precisely the settings where ... the stakes are highest. (p35).
Without the discipline and constraint imposed by the rigorous testing imposed by simulation, it becomes all too easy for mental models to be driven by ideology or unconscious bias. (p37).
System dynamics was designed specifically to overcome these limitations. ... As Wolstenholme (1990) argues, qualitative systems tools should be made widely available so that those with limited mathematical background can benefit from them. (p38).
Most important ... simulation becomes the main, and perhaps the only way you can discover for yourself how complex systems work. (p38)
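To make the p29 claim concrete: a first order linear positive feedback loop is just a stock whose inflow is proportional to the stock itself. Here is a minimal sketch; the initial value, growth rate, and horizon are arbitrary illustration values.

```python
# A first order linear positive feedback loop: the stock grows in
# proportion to itself. All numbers are arbitrary illustration values.
stock = 100.0        # initial stock
growth_rate = 0.07   # net inflow per step, as a fraction of the stock
steps = 50

for t in range(steps + 1):
    if t % 10 == 0:
        print(f"t={t:3d}  stock={stock:10.1f}")
    stock += growth_rate * stock   # inflow = growth_rate * stock
```

The output is exponential growth. Asked to extrapolate such a series by eye, most people guess something close to a straight line and badly underestimate the later values, which is the kind of misperception of feedback Sterman describes.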
On High-Reliability Organizations, which are sobering: they try really, really hard not to have accidents, and still fail from time to time: