Comments on life, science, business, philosophy, and religion from my personal public health viewpoint
Thursday, November 27, 2008
OK, seriously, WHY didn't we see it coming?
My comment in response to Paul Krugman's NY Times column today, "Lest We Forget".
========================================
Your question is superb - How did those at the top not see this coming, or take it seriously, despite many stifled voices below pointing at it in alarm?
Yes, if financial things broke on this shoal, fix the financial things.
But, at the same time, this shoal has got to go, or it will just demolish the repair effort in a never-ending cycle of "How did that happen? Fix and forget."
This exact problem is well known and well documented across industries, government agencies, auto companies, universities, etc. This process is ALSO broken, and needs to be addressed, with as many billions of dollars as are being spent repairing the damage it caused.
Social decision-making processes are no more abstract than financial markets, but they get no respect, sitting as they do at a higher-leverage, further-upstream, less visible place in the chain of events.
High-reliability human systems have been studied extensively, from Chernobyl to the Bay of Pigs to Challenger to aircraft cockpit teams to hospital surgical teams to the US Army Leadership Field Manual. The answer always comes down to the same thing -- dissenting views need to be heard, and dissenters need what Harvard Professor Amy Edmondson calls "Psychological Safety," or they will wilt and become ineffective. This is how humans behave, and unless deliberate steps are taken, the system always breaks along this fault line.
The right question, then, is: who is going to take charge of seeing that those steps are taken and that level of social literacy achieved?
I can't emphasize enough how much more important this is than more math and science; without it, the math and science won't save us. As T.S. Eliot said, we repeatedly get burned "dreaming of systems so perfect that no one will need to be good," forgetting that "the man that is will shadow the man that pretends to be."
Much of my weblog is about what we really need to do to avoid such errors in judgment. I can only hope the right person wakes up, reads it, and follows the links to sources such as MIT's papers and John Sterman's work on how poorly we perceive systems that involve feedback.
"Why we have so much trouble seeing" (and what to do about it.)
http://newbricks.blogspot.com...
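To make Sterman's point concrete, here is a minimal sketch of the classic System Dynamics "stock management" experiment -- a manager steering a stock toward a target through a delivery delay. The code and parameter values are my own illustration, not Sterman's actual model, but the behavior is the one his subjects stumble over: the intuitive policy overshoots and oscillates because it ignores what is already in the pipeline.

```python
# A toy version of the stock-management task: orders arrive after a
# delay, and the naive policy ignores orders already in transit.
# (Illustrative parameters only -- not Sterman's published model.)

TARGET = 100.0          # desired stock level
DELAY = 4               # steps between placing an order and delivery
GAIN = 0.5              # fraction of the gap corrected each step
LOSS = 10.0             # steady consumption draining the stock

stock = 50.0
pipeline = [0.0] * DELAY    # orders placed but not yet delivered

for t in range(40):
    stock += pipeline.pop(0) - LOSS      # oldest order arrives; use drains
    gap = TARGET - stock
    order = max(0.0, GAIN * gap)         # naive: react to the gap alone
    pipeline.append(order)
    print(f"t={t:2d}  stock={stock:7.1f}  order={order:6.1f}")
```

Watching the stock swing past the target and back again is a small dose of the humility Sterman documents: even a two-variable feedback system defeats untrained intuition.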
(photo by me - "Fixed at last!")
Monday, December 17, 2007
New life forms from Synthetic DNA - Washington Post

The Washington Post today deals with "Synthetic DNA on the Brink of Creating New Life Forms." Talk about children playing with matches... Rick Weiss begins "It has been 50 years since scientists first created DNA in a test tube..." I'd add - it has also been 50 years since Jay Forrester's classic piece on "unintended consequences."
Here was my reply:
Good books like "Lethal Arrogance" by Dumas and "Normal Accidents" by Perrow detail hundreds of examples of our tendency to run things till they break, and only then stop to think.
The tools to even begin to think about the way coupled feedback loops get their job done, such as System Dynamics, have languished for 50 years. MIT's John Sterman, in "Business Dynamics: Systems Thinking and Modeling for a Complex World," details the lack of correct intuition about feedback, even in the MIT community, brighter than most. PhDs don't generally help, and most of us have less to work with.
So, at best we can model and simulate, which has been done at the Santa Fe Institute for the last few decades, with "artificial life" - virtual life and virtual DNA, genetic algorithms breeding and evolving, to see what happens. http://www.santafe.edu/ describes the work of many Nobel Prize winners.
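For a feel of what "genetic algorithms breeding and evolving" means in practice, here is a toy sketch -- my own illustration, not Santa Fe Institute code -- of virtual DNA as bit strings under mutation, crossover, and selection. Even this trivial version routinely finds the "perfect" genome within a few dozen generations:

```python
# A toy genetic algorithm: virtual DNA as bit strings, bred and mutated
# under selection. (An illustrative sketch, not Santa Fe Institute code.)
import random

GENOME_LEN = 40
POP_SIZE = 60
MUTATION_RATE = 1.0 / GENOME_LEN

def fitness(genome):
    # "OneMax": fitness is simply the number of 1 bits in the genome.
    return sum(genome)

def mutate(genome):
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        print(f"perfect genome found at generation {generation}")
        break
    # Keep the top half; refill the rest by breeding random survivors.
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children
```

Real artificial-life experiments replace the fixed fitness function with organisms competing against each other -- which is exactly where the parasitism mentioned below shows up.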
In short (1) the little buggers are far smarter than we are and (2) parasitism evolves almost instantly in every case. The lesson of the movie Jurassic Park is a mild taste of the tenet "Life will find a way."
If the rest of our human affairs were measured and mature and stable, this would still be a risky business. Having unstable tyrants convinced they must "master" this technology and use it to attack others, or defend from attack (the exact same research), leads to the Russian model of stockpiling hundreds of tons of anthrax or worse, under the delusion that bio-warfare would be controllable or could be "won."
There are good odds the viruses and fungi and insects will win, not so good for humans.
Life is built from interactions with emergent properties on multiple levels, yet we tend to think of "machines" at one level with only one function. But genes don't work like machines; they work like cooperative swarms.
Bio-warfare research has a "life of its own" that should already put us on alert: it is far easier than we think to create things that "might as well be alive." Since we cannot stop it, we are committed to trying to get ahead of it and get the reins back, which means we should pour billions into understanding the world the Santa Fe Institute has pioneered - massive interactions, how they go good, and how they go bad.
It becomes clear very quickly that, with complex systems, by the time you realize you "shouldn't have done that" it's too late. Experience is something that comes just after we need it.
For very high-stakes mistakes, that's too late. If we keep gambling with the whole planet on the table, sooner or later we'll lose one turn.
One is all it takes.
=========
Actually, all the research on high-reliability systems like nuclear power plant control rooms shows that the maturity of the social system is what makes or breaks the technology-based system. Psychologically safe environments are needed for people to raise their hands, without fear of reprisal, and question what the heck is going on.

What we have instead is a whole culture accustomed to using fear as the workplace and political context for "getting things done," as described by Harvard Professor Amy Edmondson.
The Shuttle Challenger (picture at left) exploded because of an "O-ring" problem that the project engineers knew about; they had in fact gone in that day to tell the boss to tell the White House that it was too cold to launch safely. They all lost their nerve under workplace pressure to "deliver" so the President could talk to an orbiting teacher during the State of the Union address. She did, in fact, leave a message for us (picture at left) about what happens when we don't listen -- but, I guess we're still not learning that lesson.
Further reading
The classic paper in this field is Jay Forrester's congressional testimony:
"The Counterintutive Behavior of Social Systems",
http://web.mit.edu/sdg/www/D-4468-2.Counterintuitive.pdf
Quoting the abstract:
Society becomes frustrated as repeated attacks on deficiencies in social systems lead only to worse symptoms. Legislation is debated and passed with great hope, but many programs prove to be ineffective. Results are often far short of expectations. Because dynamic behavior of social systems is not understood, government programs often cause exactly the reverse of desired results.
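To see how honestly intended policies fall "far short of expectations," here is a toy model of induced demand -- my own illustration, not Forrester's -- where building more road capacity invites more traffic, so congestion improves far less than the road-building would suggest:

```python
# A toy sketch of Forrester-style "policy resistance" (my illustration,
# not from the paper): add road capacity, and latent demand expands to
# fill it, eating most of the intended congestion relief.

def simulate(build_rate, steps=100):
    capacity, traffic = 100.0, 90.0
    for _ in range(steps):
        congestion = traffic / capacity
        capacity += build_rate                 # the policy lever
        # Latent demand: traffic grows when roads feel clear,
        # shrinks when they feel jammed.
        traffic = max(0.0, traffic + 5.0 * (1.0 - congestion))
    return capacity, traffic

for build_rate in (0.0, 1.0, 3.0):
    cap, tr = simulate(build_rate)
    print(f"build rate {build_rate:.1f}: capacity {cap:5.0f}, "
          f"traffic {tr:5.0f}, congestion {tr / cap:.2f}")
```

Most of the new capacity is absorbed by new traffic: the system pushes back against the policy, which is Forrester's point.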
Another quote from the Washington Post article is this:
"We're heading into an era where people will be writing DNA programs like the early days of computer programming, but who will own these programs?" asked Drew Endy, a scientist at the Massachusetts Institute of Technology.
The article quotes someone on the "unprecedented degree of control of creation" that the DNA technology gives us. Right. This is about the degree of "control" that a Labrador Retriever on your lap in the car at rush-hour has -- yes, it can turn the steering-wheel, but I wouldn't use the term "control" for what happens next. If you think our economy and business development and health care system are "under control", then maybe you would think genes could be "controlled" the same way - and they can, with about the same results.
Sadly, control requires maturity and depth of understanding, instead of simply strong muscles and a short attention span. I wish it were our strong suit as a nation, but see little evidence that it is, or even that it is valued or desired as a long-term goal.
We have instead young children playing with the cool gun they found in daddy's nightstand.
Oops.
======= Some after-thoughts:
The hardest part of that task is that the next generation typically doesn't want to take advice from old people about situations the village elders seem way too concerned about - like, not going into debt over your head, you know, crazy stuff like that.
George Santayana said "Those who cannot remember the past are condemned to repeat it." I'd modify that slightly and add "Those who cannot learn from near-misses will someday not miss."
Each time we don't learn this, as a society, the costs go up. The biggest unknown in the Drake Equation, which estimates the odds of there being other intelligent life in the galaxy that we could detect by radio, is how long a civilization survives after it has acquired that much technology. The complete absence of any detectable signals from the hundreds of billions of worlds "out there" suggests this is a pretty small number of years -- maybe under 200.
At the rate we're going, we're heading towards adding one more point to that data set.
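For the curious, the Drake Equation is just a product of factors, N = R* x fp x ne x fl x fi x fc x L, and N is directly proportional to L, the average lifetime of a detectable civilization. Here it is worked out in a few lines; the factor values are commonly quoted placeholders (every one is hotly debated), not measurements:

```python
# The Drake equation with illustrative placeholder values. N scales
# linearly with L, the lifetime of a detectable civilization -- which
# is the post's point: if L is small, the galaxy falls silent.

R_STAR = 1.5   # new stars formed per year in our galaxy
F_P    = 0.9   # fraction of stars with planets
N_E    = 0.5   # habitable planets per star with planets
F_L    = 0.1   # fraction of those where life arises
F_I    = 0.1   # fraction of those evolving intelligence
F_C    = 0.1   # fraction of those producing detectable signals

for L in (200, 10_000, 1_000_000):   # civilization lifetime in years
    N = R_STAR * F_P * N_E * F_L * F_I * F_C * L
    print(f"L = {L:>9,} years -> N = {N:,.2f} detectable civilizations")
```

With the pessimistic L of 200 years suggested above, N drops below one even with these moderately generous values for the other factors -- silence, in other words.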
Humans are remarkably inventive, and if every weapon and sharp object on the planet vanished, they'd find ways to attack each other with stones. Instead of tackling each symptom like global warming or genocide or terrorism, it would seem wiser to track further upstream and find the root-cause problem for why people are driven to fight, and fix that.
======================================
More further reading:
On High Reliability organizations, which are sobering: they try really, really hard not to have accidents, and still fail from time to time:
http://www.highreliability.org/
I'm sure the US military tries very hard to keep nuclear weapons under control. Even that intense level of attention isn't enough to do the job 100% of the time, illustrating John Gall's law that "complex systems simply find complex ways of failing."
"Honey, I lost the nuclear weapons"
Crossing the Quality Chasm and other links
Photo credits :
Oops (car) by estherase
US Space Shuttle by Andrew Coulter Enright
Tuesday, October 02, 2007
Encouraging Dissent in Decision-Making
"Our natural tendency to maintain silence and not rock the boat, a flaw at once personal and organizational, results in bad—sometimes deadly—decisions. Think New Coke, The Bay of Pigs, and the Columbia space shuttle disaster, for starters. Here's how leaders can encourage all points of view."
That's how Harvard Business School Professor Amy Edmondson describes her paper "Encouraging Dissent in Decision-Making."
In this and other papers she describes how a culture of fear and anxiety in American business organizations effectively suppresses both dissent and innovation, resulting in deadening places to work that are neither competitive, agile, nor adaptive.
A good paper of hers that is not listed there is "Speaking Up in the Operating Room: How Team Leaders Promote Learning in Interdisciplinary Action Teams", Journal of Management Studies 40:6, September 2003. Here she follows up the same thread of work that got Dr. Peter Pronovost of Johns Hopkins the Eisenberg award -- figuring out how to let nurses be heard when they saw something that was out of place in the operating room, where historically their voice was neither welcomed nor heard.
As with the Army Leadership Field Manual (FM22-100), the challenge for organizational revitalizing coaches is to disentangle the lines of authority (meaning command) from the lines of authority (meaning confirmed knowledge or eyes from boots on the ground.)
The elitist British culture of the 1800s gave us a management model in which the human beings in "management" were considered genetically superior to the other life forms called "labor," endowing management with unique skills, a monopoly on all wisdom, and thereby a claim to all authority.
In the 21st century, organizations are so large, so rapidly changing, and so complex that the sources of wisdom have to be eyes at the front, and "management" is always playing catch-up with a legacy mental model that is running behind. The "higher" up the chain of command managers are, the more removed they are from the reality at the front.
General Colin Powell said once that, if a General in Washington and a soldier at the front-line disagreed on a fact, he'd side with the soldier as having more current information.
The problem is that, with authority-type-1 (power to issue legitimate orders to others) tangled up with authority-type-2 (sight and possibly insight as to what's going on in the real world), management too often perceives a challenge to authority-type-2 as an insubordinate challenge to authority-type-1, and quickly moves to "put down the rebellion."
The result is to blind upper management entirely. It now lives in a mental model detached from reality, spinning off out of control, clueless as to why its actions are proving ineffective or counter-productive -- since, by its understanding of the situation out there, what it is doing should have worked.
Anyway, the military has worked this out, at least in concept, and FM22-100 is a superb description of how an organization can retain authority-type-1 (central command) and open up and delegate authority-type-2 (new eyes with surprising news that may totally revise the picture of what's going on outside).
Other organizations, such as hospitals, might be able to learn something from how the Army figured out how to disentangle those two concepts.
In my mind, it is simply a "vertical loop" where there are two pipes, not one. Commands come down one pipe from above, and news about reality, particularly surprising news that central command's picture of the ground needs updating, goes up a different pipe. The two not only don't interfere, they form a loop in which each amplifies the other.
This is a single cybernetic loop, a "clothesline loop over a pulley at each end": the more each side PULLS on it, the more the other side moves in the direction they want it to go.
So, the guys on the ground have to PULL on the rope, and willingly accept orders from above, while at the same time the guys on the top have to PULL on the other side of the rope, and willingly accept updates to their mental model of the situation from below.
The whole thing breaks down if either side fails to do its job. If Generals issue orders but never listen to what the result was, the result is always defeat on the battlefield. If soldiers want a say in what's going on and what's decided next, but don't want to listen to the resulting stream of orders, that breaks down too.
But if both sides do their jobs, soldiers listen to orders (authority-1, the down-going rope), and generals listen to soldiers (authority-2, the up-going rope), then after a transitional period where trust is being built and this is becoming "phase-locked" and synchronized, we have the full power of cybernetic control available to the organization -- the best of both worlds.
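Here is the two-pipe loop as a toy simulation -- my own sketch, not anything from FM22-100. "Generals" issue proportional corrections based on the front's reports, "soldiers" execute them, and reality keeps drifting. Cut either pipe and watch what happens to the average error:

```python
# A toy two-channel control loop: reports flow up, orders flow down.
# Cutting either channel breaks control. (Illustrative sketch only.)
import random

def campaign(listen_up, listen_down, steps=50):
    position, target = 0.0, 10.0
    believed = position          # HQ's mental model of the front
    total_error = 0.0
    for _ in range(steps):
        target += random.uniform(-1.0, 1.0)   # reality keeps shifting
        if listen_up:
            believed = position  # reports flow up: model matches reality
        order = 0.5 * (target - believed)     # command flows down
        if listen_down:
            position += order    # troops execute the order
        total_error += abs(target - position)
    return total_error / steps

random.seed(1)
print("both pipes open:", round(campaign(True, True), 1))
print("up pipe cut    :", round(campaign(False, True), 1))
print("down pipe cut  :", round(campaign(True, False), 1))
```

With both pipes open, the error stays small. With the up pipe cut, headquarters steers by a stale mental model and the error explodes -- the "spinning off out of control" described earlier. With the down pipe cut, nothing ever gets corrected at all.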
The transitional period to this model can be helped, I think, if what "lean" calls "the final state" is clear to everyone in advance. Lean production thinking ("The Toyota Way") is described by people such as Professor Jeff Liker as resting very strongly on "philosophy," a term that is largely discounted and meaningless to Americans today.
A better word, the word used in FM22-100, is "Doctrine." That word also has a bad flavor in an American culture that worships "freedom," but some kinds of freedom are in the way of success. Runners with rigid bones can move faster than jellyfish. It's a nuanced subject, this rigor versus local-rigidity-with-pivots. But, even more so than "doctrine" come the other dreaded words - "discipline" and "standards."
The US Army has worked its way through those nuances, disentangled the different meanings of authority, and, to the extent their doctrine of accepting both command from above and "dissenting views and challenges to the model" from below is utilized, they are basically unstoppable in their mission.
They can still be defeated if their Doctrine is broken by leaders at the top who want to pick and choose, keeping the "giving orders to below" part, but discarding the "getting updates from below" part. That's not a failure of the Doctrine, it's a failure to follow the Doctrine.
The cybernetic "clothesline" only works if the up channel and down channel are both working and mutually supportive. Some of the first orders downward have to be "send more dissent upwards! We can't hear you!"
This is the nature of the problem that Professor Amy Edmondson researches, the one that started this post: how to overcome fear of speaking up and empower workers to dissent.
But, mind the nuances, please. Dissent-type-1 (disagreeing with the mental model) needs to be ENcouraged. Dissent-type-2 (disagreeing that the command structure should exist, or with one's role in complying with it) is to be DIScouraged, as always.
The command structure gains credibility and strength to the extent that commands reflect good judgment based on good data, and the only source of that data is the boots on the ground at the front. If the soldiers keep their place (and listen to commands) and the generals keep their place (and listen to advice), it comes together and works.
The most common mode of failure appears to be generals who mistake a fraudulent silence, caused by suppression of dissent-type-1, for agreement with their mental model, and then keep issuing orders that are detached from reality -- resulting in contempt for the whole system and ultimately the collapse of the command structure itself, to say nothing of military defeat.
======== afterword
I realized after I posted this that many middle-class suburban children have never actually seen a clothes-line these days. They've grown up with gas or electric dryers, and clothes-lines are prohibited as being tacky or lower-class by suburban Covenants and Restrictions for housing developments.
So, I put a picture of one above. (source: Blessings in the South.) It was remarkably hard to find this picture. This "simple machine" turns out to be a source of insight of incredible importance that today's generations don't even have in their mental toolbox or vocabulary.
For those who have never seen one in action: there is a loop of rope strung between two pulleys, one at each end, attached to poles. In the picture above the lady has three such loops.
She stands in one place and hangs a sheet, say, over one side of the rope, clips it on with clothespins, and then pulls the other side of the loop toward her, which pulls the sheet she just hung away from her, opening up a new spot for the next thing to be hung. That way she doesn't have to move the basket of clothes or herself.
The rule of thumb, of course, is that "You can't push with a rope." Yet, with a loop of rope, effectively you CAN push with a rope, by pulling.
This is the magic of the vertical management loop in the Army Doctrine. To "push" your advice upwards, which is "impossible" as it is pushing on a rope, you "pull" on the commands coming downward. And at the top, to "push" your commands downward (also seemingly impossible), you clear the way by "pulling" the reactions and comments from the troops upwards.
Neither side can cheat here - the rope has to be continuous, and trust in it has to build over time, but then, this model actually does work for the US Army. They can balance very strict command with very good intelligence, overcoming the old saw that "army intelligence" was an oxymoron.
The same principle could, in principle, be applied to government in general ("government intelligence") and corporate management in general. The same rules apply. If both sides do their jobs, it works, and victory is possible. If either side only wants to "push" and doesn't want to "pull", the whole thing breaks down and becomes dysfunctional and defeat is likely.