For Understanding, Look At Belief Dynamics
Tue Mar 16 10:16:45 GMT 2010 by Wade Schuette
http://newbricks.blogspot.com
"All models are wrong, but some models are useful." (George Box, statistician).
In assessing belief systems I suspect it would be useful to look less at what belief system a person holds and more at how that belief system behaves when presented with dissonant or contrary evidence.
If everyone could stop the mud-slinging between "religion" and "science" for a moment and look instead at the camps "mind is closed to contrary data" and "mind is open to contrary data", I think we'd find much more common ground, along with the social value sought in terms like "logical", "sensible", "rational", or "defensible" thinking.
Also, I think that the "open mind" and "closed mind" categories are not the exclusive property of any given camp. We have only to look at the history of "paradigm shifts" and the huge battles over "plate tectonics", "jumping genes", or "epigenetic inheritance" to see that there are a great many "scientists" who are "closed-minded".
The work of Professor Karl Weick and others on "mindfulness", along with the whole field of safety engineering, has produced a vast literature on how hard it is for some people to hear contrary data or give it a "fair hearing". Pilots discount a copilot's dissenting view on clearances, surgeons discount a nurse's dissenting view on which kidney is being removed today, and governmental and corporate leaders discount pretty much everything besides their own pet mental models.
One well-established psychological test, the "Wisconsin Card Sorting Test", measures how well people, once they have learned a set of rules, can adjust to changed circumstances and different rules.
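The rule-switching dynamic the test measures can be sketched in a few lines of Python. This is a hypothetical simplification of the idea, not the clinical protocol: a hidden sorting rule silently changes after a run of correct sorts, and we count "perseverative errors", sorts made by the stale rule after the switch. The `adapt_delay` parameter is my own stand-in for how "closed" the sorter's mind is.

```python
import random

def run_wcst(trials=80, switch_after=10, adapt_delay=3, seed=0):
    """Count perseverative errors for a sorter who needs `adapt_delay`
    consecutive failures before abandoning a rule that stopped working."""
    rng = random.Random(seed)
    rules = ["color", "shape", "number"]
    true_rule = believed_rule = "color"
    streak = 0                  # consecutive correct sorts under the current rule
    misses = 0                  # consecutive errors since the hidden switch
    perseverative_errors = 0
    for _ in range(trials):
        if believed_rule == true_rule:
            streak += 1
            if streak == switch_after:          # examiner silently changes the rule
                true_rule = rng.choice([r for r in rules if r != true_rule])
                streak = misses = 0
        else:
            perseverative_errors += 1           # still sorting by the stale rule
            misses += 1
            if misses == adapt_delay:           # feedback finally overrides habit
                believed_rule = true_rule
                streak = misses = 0
    return perseverative_errors

# A "flexible" sorter updates after one failure; a "rigid" one needs eight.
flexible = run_wcst(adapt_delay=1)
rigid = run_wcst(adapt_delay=8)
```

The point of the sketch is the comparison: over the same number of trials, the rigid sorter racks up several times as many perseverative errors as the flexible one, which is exactly the "belief dynamics" quantity the essay argues we should be looking at.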
Having studied both religion and science for the last four decades, I am astounded at how tightly scientists cling to arbitrary oversimplifying guidelines, such as "if you can't express it in mathematics, it isn't real." Until the development of powerful computer modeling, solid qualitative reasoning seems to have been left to the theologians and social scientists, who deal with far more complex systems than the so-called "hard sciences" and don't have the luxury of reducing complexity to things that can be easily measured or disconnected from context non-destructively.
That said, beliefs, perceptions, and even "biological health" are increasingly understood to be a function not just of "a person" but also of their living social context of other people. The edges of this "person" abstraction are not nearly as solid and clean as the classic machine-based scientific model posited. (Or should I say, closed-mindedly assumed to be so obvious that anyone not demented could see it.)
Does a belief system have room for "uncertainty"? Can a person, or manager, or political leader say "I just don't know", or are there huge social forces that reinforce training to pretend they do?
Science and religion each have their share of closed-minded bigots of particular mental models. THAT is our common enemy, not "science" and not "religion".