Why do people disagree about what they see?
If they disagree does it mean that at least one of them is wrong?
The short answer is that even very competent, trained observers are often wrong about what they see, and, in many cases, even when they perform flawlessly, they still turn out to be wrong.
And, very importantly, these differences in perception, combined with a lack of understanding of their causes, have led to many arguments, battles, and probably, quite literally, entire wars.
And, as Thomas Kuhn captured so well in The Structure of Scientific Revolutions, totally new paradigms are often strongly ridiculed and not "seen" by, let alone "obvious" to, the vast majority of competent scholars early in the lifetime of the idea.
And, from long experience with the insidious impact of unconscious bias, even among very highly trained researchers, the standard in medicine is the "double-blind" study, designed to get around the well-known tendency to see what we expect to see, or to see what we want to see -- or not to see things that would be very painful if seen. In The Mismeasure of Man, Stephen Jay Gould describes extensive case studies of how things scientists were sure were true about humans have proven to be patently false.
Human vision is a very tricky thing, swayed by peer-group pressure, swayed by biases born of fears, desires, prior experience, and conditioning, and subject, at best, to numerous kinds of reproducible errors. Yet it is good enough for us to get by, mostly.
I do not share the view of those who swing the pendulum to the far side and therefore claim that "nothing can be known", or that anyone's opinion is as good as anyone else's, "because it's all relative and subjective" anyway. Even Einstein, so often misquoted, showed that with the corrections and adjustments he proposed, the seemingly conflicting views of two observers could in fact be perfectly reconciled.
We do not know and cannot see "everything", and at the same time, we do know and can see "something" -- the truth is in that inconvenient middle ground.
Some improvements can be made by multiple observers working together. Again there are traps and pitfalls, such as "Group Think", the tyranny of the majority, and the suppression of minority or inconvenient views, but the truth is neither pure white nor pure black.
Because humans rely heavily on their visual processing circuitry to do their "thinking", and attempt to "see" things in the sense of understanding them, the failure modes of this circuitry are of interest to us all. Further, humans tend to stack strong judgmental values on top of what they perceive as "true" or "obvious" -- so social action often follows from such broken vision.
Furthermore, groups of humans tend to influence each other's perceptions in invisible ways, so that the entire collection of them, starting perhaps from one undecided state, can break into "camps", each seeing something different more and more as time evolves. Differences from other "camps" are viewed as some kind of enemy action, further suppressing rational discussion. Once very widely held, these mistaken views latch in and lock down, becoming shared norms and prejudices, filtering further information gathering so that it only supports them, thereby becoming self-supporting and, in fact, attacking efforts to dislodge the majority error.
Most of this, it turns out, is really a function of perception, and is as true of robots trying to "see" something as it is of humans.
What do we make, then, of a person who "sees things" in a way different from our own? Or a person who sees entire things we don't see at all, and don't believe are there? Are they to be considered "crazy" or "heretics"?
The human visual system has both feedback and feed-forward, and learns over time. As a consequence, things that we really need to see, we can often learn to see easily. A botanist glancing at a plant, or a professional IT person glancing at software, can often see a huge amount that a layperson cannot see at all, even when instructed where to look. We don't have to train our eyes deliberately; they figure out what we need to see and get better at showing it to us, dynamically, over time.
But this system in humans is linked into the pleasure-seeking and pain-avoiding systems as well. Some views or perceptions are essentially certain to lead to painful memories, or to actions that will produce pain and have produced pain in the past, and our visual systems happily and automatically reshuffle neurons so that the painful perceptions and conflicts are reduced. Often, these systems reduce the pain to zero by making certain perceptions simply impossible for us to see any more. If we're not testing for this, we can easily miss the effect.
So, for example, if a person feels extremely insecure and vulnerable, their perceptual system may figure out that most complexity or harshness causes intolerable pain, and may decide, on its own, to shut off perception of complexity or threats. The person has been moved, by their visual system, into a very simplified world where nothing is complex and everything is safe, despite what others see. At the extreme, we classify such people as "mentally ill", but most of the intermediate states -- inability to cope with complexity -- simply turn into types of "political" viewpoints, in which simplistic, black-and-white views of the world are desperately clung to and become self-righteously self-reinforcing, blocking out all contrary information and often dismissing all those who disagree with the conclusions as "enemies".
There are many counter-measures we can take, such as pooling notes, or all agreeing that we should be polite to those who disagree with us or espouse contrary opinions, and hear them out. The trend these days in the USA, however, seems to be to shout them down, not to hear them out.
On a personal, team, department, or national level, providing a forum for different "sides" and hearing each other out can, where possible, avoid many errors we would later regret. This too, the essence of civilized discourse, is out of vogue in the US, where even allowing the impression of uncertainty is considered a sign of "weakness" and inappropriate in a "leader".
The literature of high-reliability systems is clear that suppression of dissenting opinions is a fast path to disaster, but this result is not widely understood, even if known.
I think it is probably correct, although not necessarily "safe" to say, that at least half of what each of us believes to be "obviously true" is, in fact, not true. The problem is, we don't know which half. And, most adults are not very happy to have one of their errors pointed out to them.
Trained and practiced collaboration in a psychologically safe environment can get around most of these problems, but is hard to do and not taught in school. Most "meetings" of "committees" at work are not best characterized as a sincere and loving search for the truth amid seemingly conflicting interests and viewpoints.
Despite the dismal track record, it is still possible to get consultation to actually work, if properly facilitated, and it is always worth the effort.
Many of the current catastrophes in the news could have been avoided entirely if widespread consultation were the norm. People would not have bought into ridiculous mortgages, for example, if they had consulted with the community first.
Why and how we have ended up at a point where asking for advice from our own people's experts, whoever those people are, is no longer "cool" is a topic for another day. Part of it is certainly tied up with a definition of "male behavior" that approves of driving around lost for an hour instead of stopping to ask directions, and with larger-scale analogs of that activity on a departmental, corporate, state, or national level.
The title of this post, "On things you have to believe to see", is a reference to a phenomenon in machine vision known as "model-based perception" -- which is in turn modeled after the way humans perceive the visual and audio streams of data that flood their brains.
There is always far more information than can be processed by the fastest processor, much of it ambiguous or supporting conflicting interpretations, and much of the important part arriving at lower volume than the noise.
In response, machines (and humans) simplify life by holding an internal mental model of the world until they can hold it no longer, filtering the fire-hose of data down to whatever resolves along the axes of that simplistic model, and discarding everything else.
The good news is that, if the model is correct, or nearly correct, this approach discards the noise and keeps the signal, making life good.
The bad news is that the very same stream of data could support hundreds of thousands of interpretations equally well, and the one we have may not be even close to the best interpretation of the data.
However, any model at least lets us operate and make decisions quickly, and get feedback if we are wrong -- and in the real world, that approach usually works far better than the "paralysis of analysis" and attempting to understand everything all the time.
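To make the mechanism concrete, here is a minimal sketch of model-based filtering in Python. The class name, the gate threshold, and the readings are all invented for illustration, not taken from any particular vision system: the tracker predicts what it expects to see, keeps only observations close enough to that prediction, and coasts on its own belief when nothing fits -- which is exactly how a correct model suppresses noise, and a wrong one suppresses evidence.

    # A toy illustration of model-based perception (hypothetical names throughout).
    # The "model" is just a belief about where an object is and how fast it moves.
    # Observations falling outside the gate are discarded as noise -- the same
    # filtering described above, with the same failure mode: if the belief is
    # wrong, the contradicting evidence is the very data we throw away.

    class ModelBasedTracker:
        def __init__(self, position, velocity, gate=5.0):
            self.position = position   # where the model believes the object is
            self.velocity = velocity   # how the model believes it is moving
            self.gate = gate           # how far a reading may stray and still be "seen"

        def step(self, observations):
            # Predict what we expect to see, based only on the internal model.
            predicted = self.position + self.velocity

            # Keep only observations that agree with the prediction; drop the rest.
            accepted = [obs for obs in observations if abs(obs - predicted) <= self.gate]

            if accepted:
                # Nudge the belief toward the evidence we allowed ourselves to see.
                measurement = sum(accepted) / len(accepted)
                self.velocity += 0.5 * (measurement - predicted)
                self.position = 0.5 * predicted + 0.5 * measurement
            else:
                # Nothing fit the model, so we coast on the old belief --
                # self-consistent, fast, and possibly completely wrong.
                self.position = predicted
            return self.position

    # Example: the readings jump to a new trajectory at t=5, but the tracker,
    # believing its own model, discards the "impossible" readings and never
    # recovers in this toy run.
    tracker = ModelBasedTracker(position=0.0, velocity=1.0)
    readings = [1.2, 2.1, 2.9, 4.2, 5.0, 20.0, 21.1, 22.3, 23.0, 24.2]
    for t, r in enumerate(readings):
        est = tracker.step([r])
        print(f"t={t}  reading={r:5.1f}  model's estimate={est:5.1f}")

In the example run, the data change underneath the model halfway through, yet every new reading is rejected as noise and the estimate drifts along on the old belief -- a small, mechanical version of seeing only what we already believe.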
As thousands of studies have shown, we tend to see what we are looking for, and tend to suppress contrary information. A radiologist, asked to look at a chest X-ray, will see different things if asked "Is this a tumor?" than if asked "What do you see here?" Which one is the better question "depends."
Another thing that is fatal to animals and humans, aside from the inability to operate quickly, is the inability to hold a course of action -- i.e., dithering, or continually second-guessing an opinion. Once we have formed a perhaps arbitrary decision, our brains automatically try to avoid that by selectively showing us data that support the decision, and selectively masking out or hiding data that contradict it. The result is that, even if we are wrong, at least we are self-consistent for a while.
Confidently taking the wrong exit off the expressway is probably safer than continually changing lanes as one tries to decide if this is the right exit or not. The problem in most cases will become obvious later, and be sorted out then.
Is it a good thing that we are blind to and oblivious of our own frailty of perception and judgment? Probably: in many cases, we can at least operate at all if we act as if we knew what we were doing.
The downside is that misconceptions, errors, bias, prejudice, and hatred can all become self-fulfilling features of our lives all due to inability to perceive correctly what is going on around us, as well as due to the harsh way we often treat those whose views are contrary to our own, or foreign, or incomprehensible to us, or "clearly wrong."
This mishmash of human emotions, behaviors, perceptions, and prejudices, and the norms regarding them, is part of the culture that has to be overridden in order to establish a "lean", Toyota Production System-type, high-reliability operation. To get high-quality, reliable product out the back door, we need a psychologically safe, humble, listening culture at the front, where it's safe to say "I don't know" or "Can anyone help me?", or safe to say "Er, excuse me, Doctor, I realize you are sure of your own viewpoint, but aren't we doing the LEFT arm today?"
We override such civilized culture only at high risk of taking the wrong exit, the wrong arm, or the wrong war needlessly.