Showing posts with label authority. Show all posts

Wednesday, October 28, 2009

Mindfulness and fighting wild fires, and the value of simulations for training

Professor Karl Weick at the University of Michigan has written extensively on the need for "mindfulness" in emergency situations, such as, literally, fighting forest fires.

A mindful crew or crew-chief will be aware that they are operating on a mental model, and that model may be incorrect, so they must be alert to even very small signs that they have completely misconstrued the situation.

There are lessons here, on a longer time scale, for every leader, civilian or military.

Here are some public documents on the subject.

http://www.wy.blm.gov/fireuse/2009mtg/presentations/HROs-mindfulness.ppt

Teaching Mindfulness to Wildland Firefighters (Fire Management Today, Spring 2008, Dave Thomas)

For the last 3 years I have taught half-day workshops, conducted 1-hour lectures, and provided general awareness speeches about the Weick/Sutcliffe model of High Reliability Organizing as described in their book Managing the Unexpected: Assuring High Performance in an Age of Complexity.

This article is a series of musings, conjectures, and recommendations pulled from this teaching experience. My intent is to pass on some of the lessons that I have learned teaching High Reliability Organizing, and to pose recommendations for further study...

Today, however, mainly due to the heating of the Earth through global warming and a build-up of fuels, firefighters are working within an environmental framework of weather and fuel never experienced before. Errors that we might have "got away with" in the past could more easily become catastrophic today....

Next, I explain the irrationality (mindlessness) of always learning our primary safety lessons through trial and error. It is our job to be better at anticipating errors before they occur, before a brutal audit forces us to notice the discrepant events in the fire environment. The following quotation, which reinforces this view, is taken from French disaster expert Pat Lagadec:

"The ability to deal with a crisis situation is largely dependent on structures that have been developed before chaos arrives. The event can ... be considered an abrupt brutal audit: at a moment's notice, everything that was left unprepared becomes a complex problem, and every weakness comes rushing to the forefront."...

High Reliability Organizing

NEW! France-USA High Reliability Organizing in Incident Management Teams Project
Just like NYPD detective "Popeye" Doyle, who traveled to Marseilles in the 1970s hit movie "The French Connection," so too did a Forest Service NIMO team this past December. Only it wasn't for crime-busting this time. It was a landmark match-up between French and American Incident Management Teams to capture what makes these teams so successful in complex, rapidly changing, stressful situations. It is hypothesized that they exhibit many behaviors that directly align with high reliability organizing (HRO) concepts and principles.

( More to come)


More information:

The France-USA HRO Project (French Web Site, from Bouches du Rhone with video)
http://hro-fires.com/exercices_live.html

High-Reliability Organizing - Roberts, with Weick and Sutcliffe:
http://www.wildfirelessons.net/HRO.aspx

Center for Catastrophic Risk Management, Berkeley, CA
http://ccrm.berkeley.edu

Communication and Information technologies:
New tools for DISASTER management
Jean-Michel DUMAZ (1)
Bouches-du-Rhône Fire Department – MARSEILLE - FRANCE
2nd International Conference on Urban Disaster Reduction
November 27~29, 2007

The Bouches du Rhône
Fire Department


Wade

Monday, December 17, 2007

New life forms from Synthetic DNA - Washington Post


The Washington Post today deals with "Synthetic DNA on the brink of Creating New Life Forms." Talk about children playing with matches... Rick Weiss begins " It has been 50 years since scientists first created DNA in a test tube..." I'd add - it has also been 50 years since Jay Forrester's classic piece on "unintended consequences."

Here was my reply:

wade2 wrote:
Bio-error indeed. Maybe error-gance is the bigger threat, and very real. Our ability to reason about low-odds, very-high-risk accidents, as Carl Sagan pointed out regarding the return of samples from Mars, is completely overwhelmed by our normal intuition. At Los Alamos, the first atomic bomb was tested even though a minority of the scientists on the project (something like 6 of 14) thought it might detonate the earth's crust and explode the entire planet. No one was sure, so they tested it anyway. Hmm.

Good books like "Lethal Arrogance" by Dumas and "Normal Accidents" by Perrow detail hundreds of examples of our tendency to run it till it breaks, and then, only then, stop to think.
The tools to even begin to think about the way coupled feedback loops get their job done, such as System Dynamics, have languished for 50 years. MIT's John Sterman, in "Business Dynamics: Systems Thinking and Modeling for a Complex World", details the lack of correct intuition, even in the MIT community, brighter than most. PhDs don't generally help, and most of us have less to work with.

So, at best we can model and simulate, which has been done at the Santa Fe Institute for the last few decades, with "artificial life" - virtual life and virtual DNA, genetic algorithms breeding and evolving, to see what happens. http://www.santafe.edu/ describes the work of many Nobel Prize winners.

In short (1) the little buggers are far smarter than we are and (2) parasitism evolves almost instantly in every case. The lesson of the movie Jurassic Park is a mild taste of the tenet "Life will find a way."

If the rest of our human affairs were measured and mature and stable, this would still be a risky business. Having unstable tyrants convinced they must "master" this technology and use it to attack others, or defend from attack (exact same research), leads to the Russian model of stockpiling hundreds of tons of anthrax or worse, under the delusion that bio-warfare would be controllable or could be "won".

There are good odds the viruses and fungi and insects will win, not so good for humans.

Life is built with interactions with emergent properties on multiple levels, and we tend to think of "machines" at one level with only one function. But genes don't work like machines, they work like cooperative swarms.

Bio-warfare research has a "life of its own" that should already put us on alert that it is way easier to create things that "might as well be alive" than we think. Since we cannot stop it, we are committed to trying to get ahead of it and get the reins back, which means we should pour billions into understanding the world that the Santa Fe Institute has pioneered - massive interactions, how they go good, and how they go bad.

It becomes clear very quickly that, with complex systems, by the time you realize you "shouldn't have done that" it's too late. Experience is something that comes just after we need it.
For very high-stakes mistakes, that's too late. If we keep gambling with the whole planet on the table, sooner or later we'll lose one turn.

One is all it takes.

12/17/2007 6:07:22 AM
=========

Actually, all the research on high-reliability systems, like nuclear power plant control rooms, shows that the maturity of the social system is what makes or breaks the technology-based system. Psychologically safe environments are needed for people to raise their hand, without fear of reprisal, and question what the heck is going on.

What we have instead is a whole culture accustomed to using fear as a workplace and political tool to "get things done," as described by Harvard Professor Amy Edmondson.

The Shuttle Challenger exploded because of an "o-ring" problem that all the project engineers knew about; they had in fact gone in that day to tell the boss to tell the White House that it was too cold to launch safely. They all lost their nerve under workplace pressure to "deliver" so the Pres could talk to an orbiting teacher during the State of the Union address. She did, in fact, leave a message for us of what happens when we don't listen -- but, I guess we're still not learning that lesson.

Further reading

The classic paper in this field is Jay Forrester's congressional testimony:
"The Counterintuitive Behavior of Social Systems",
http://web.mit.edu/sdg/www/D-4468-2.Counterintuitive.pdf

Quoting the abstract:

Society becomes frustrated as repeated attacks on deficiencies in social systems lead only to worse symptoms. Legislation is debated and passed with great hope, but many programs prove to be ineffective. Results are often far short of expectations. Because dynamic behavior of social systems is not understood, government programs often cause exactly the reverse of desired results.

Another quote from the Washington Post article is this:

"We're heading into an era where people will be writing DNA programs like the early days of computer programming, but who will own these programs?" asked Drew Endy, a scientist at the Massachusetts Institute of Technology.

How true that is. I've been programming computers for over 40 years, and agree that the programs they write will be exactly like the "single-threaded" programs that mess up our airline reservations and everything else. In fact, a look inside some place like a hospital reveals the workings of multiple legacy computer systems cobbled together in the absence of any fundamental theory at all of how many interacting things should be structured in order to be reliable. Thirty years of research in computer science on "distributed operating systems" and how to build in reliability has had close to zero impact on the quick-and-dirty, cut-corners-now-and-we'll-debug-it-later model that vendors find locally profitable, but that always breaks down, producing, ta da!, more profitable rework. As a business model it's very popular; as a way of getting reliability, we have all seen the results. This is the culture we expect to "program" our genes? I'm not rushing to sign up.

The article quotes someone on the "unprecedented degree of control of creation" that the DNA technology gives us. Right. This is about the degree of "control" that a Labrador Retriever on your lap in the car at rush-hour has -- yes, it can turn the steering-wheel, but I wouldn't use the term "control" for what happens next. If you think our economy and business development and health care system are "under control", then maybe you would think genes could be "controlled" the same way - and they can, with about the same results.

Sadly, control requires maturity and depth of understanding, instead of simply strong muscles and a short attention span. I wish it were our strong suit as a nation, but see little evidence that it is, or even that it is valued or desired as a long-term goal.

We have instead young children playing with the cool gun they found in daddy's nightstand.

Oops.

======= Some after-thoughts:

Unlike the video games and computers this generation grew up with, life does not always have an "undo" button.

The core task of a civilization is to capture the wisdom we finally learn too late, and get it into a form that modifies the behavior of the next generation so those same lessons don't have to be learned all over again.

The hardest part of that task is that the next generation typically doesn't want to take advice from old people about situations the village elders seem way too concerned about - like, not going into debt over your head, you know, crazy stuff like that.

George Santayana said "Those who cannot remember the past are condemned to repeat it." I'd modify that slightly and add "Those who cannot learn from near-misses will someday not miss."

Each time we don't learn this, as a society, the costs go up. The biggest unknown in the Drake Equation, the estimate of the odds of there being other intelligent life in the galaxy that we could detect with radio, is how long a civilization survives after it has gotten to the point where it has that much technology. The complete absence of any detectable signals from 100 trillion worlds "out there" suggests this is a pretty small number of years -- maybe under 200 years.
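Since the Drake Equation is just a product of factors, the estimated number of detectable civilizations scales directly with that lifetime term. A minimal sketch in Python (every parameter value below is an illustrative guess, not a measurement):

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Estimate N, the number of detectable civilizations in the galaxy:
    N = R* x fp x ne x fl x fi x fc x L
    """
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative guesses: one new star per year, most stars with planets,
# one habitable world each, life common, intelligence and radio rarer.
base = dict(R_star=1.0, f_p=0.9, n_e=1.0, f_l=0.5, f_i=0.1, f_c=0.1)

# N is directly proportional to L, the lifetime of a radio-capable
# civilization -- the term singled out above as the biggest unknown.
for L in (200, 10_000, 1_000_000):
    print(f"L = {L:>9} years  ->  N = {drake(**base, L=L):g}")
```

Whatever values you pick for the other factors, halving or doubling L halves or doubles N, which is why a short civilization lifetime can empty the sky of signals all by itself.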

At the rate we're going, we're heading towards adding one more point to that data set.
Learning how to learn from our mistakes and our own past seems to be as important a problem as global warming, but actually more urgent, because time is running out a little faster on the 400,000 ways, besides global warming, that we can end human life on the planet.

Humans are remarkably inventive, and if every weapon and sharp object on the planet vanished, they'd find ways to attack each other with stones. Instead of tackling each symptom like global warming or genocide or terrorism, it would seem wiser to track further upstream and find the root-cause problem for why people are driven to fight, and fix that.

======================================

More further reading:

On High Reliability Organizations, which are sobering: they try really, really hard not to have accidents, and still fail from time to time:

http://www.highreliability.org/

I'm sure the US military tries very hard to keep nuclear weapons under control. Even that intense level of attention isn't enough to do the job 100% of the time, illustrating John Gall's law that "complex systems simply find complex ways of failing."

"Honey, I lost the nuclear weapons"

The US Institute of Medicine on how much the social relations of front-line teams matter when your job is to get reliability in hospital care:

Crossing the Quality Chasm and other links

=========================
Photo credits :
Oops (car) by
estherase
US Space Shuttle by
Andrew Coulter Enright

Tuesday, October 02, 2007

Encouraging Dissent in Decision-Making


"Our natural tendency to maintain silence and not rock the boat, a flaw at once personal and organizational, results in bad—sometimes deadly—decisions. Think New Coke, The Bay of Pigs, and the Columbia space shuttle disaster, for starters. Here's how leaders can encourage all points of view."

That's how Harvard Business School Professor Amy Edmondson describes her paper "Encouraging Dissent in Decision-Making."

In this and other papers she describes how a culture of fear and anxiety in American business organizations effectively suppresses both dissent and innovation, resulting in deadening places to work that are not competitive and neither agile nor adaptive.

A good paper of hers that is not listed there is "Speaking Up in the Operating Room: How Team Leaders Promote Learning in Interdisciplinary Action Teams", Journal of Management Studies 40:6, September 2003. Here she follows up the same thread of work that got Dr. Peter Pronovost of Johns Hopkins the Eisenberg award -- figuring out how to let nurses be heard when they saw something that was out of place in the operating room, where historically their voice was neither welcomed nor heard.

As with the Army Leadership Field Manual (FM22-100), the challenge for organizational revitalizing coaches is to disentangle the lines of authority (meaning command) from the lines of authority (meaning confirmed knowledge, or eyes from boots on the ground).

The elitist British culture of the 1800s gave us a management model in which the human beings in "management" were considered genetically superior to the other life forms called "labor," giving management unique skills, a monopoly on all wisdom, and thereby a claim to all authority.

In the 21st century, organizations are so large, so rapidly changing, and so complex that the sources of wisdom have to be eyes at the front, and "management" is always playing catch-up with a legacy mental model that is running behind. The "higher" up the chain of command managers are, the more removed they are from the reality at the front.

General Colin Powell said once that, if a General in Washington and a soldier at the front-line disagreed on a fact, he'd side with the soldier as having more current information.

The problem is that, with authority-type-1 (power to issue legitimate orders to others) tangled up with authority-type-2 (sight and possibly insight as to what's going on in the real world), management too often perceives a challenge to authority-type-2 as an insubordinate challenge to authority-type-1, and quickly moves to "put down the rebellion."

The result is to blind upper management entirely, which now lives in a mental model detached from reality, spinning out of control and clueless as to why its actions are proving ineffective or counter-productive, since, by its understanding of the situation out there, what it is doing should have worked.

Anyway, the military has worked this out, at least in concept, and FM22-100 is a superb description of how an organization can retain authority-type-1 (central command) while opening up and delegating authority-type-2 (new eyes with surprising news that may totally revise the picture of what's going on outside).

Other organizations, such as hospitals, might be able to learn something from how the Army figured out how to disentangle those two concepts.

In my mind, it is simply a "vertical loop" where there are two pipes, not one. Commands come down one pipe from above, and news about reality, particularly surprising news that central command's picture of the ground needs updating, goes up a different pipe. The two not only don't interfere, they form a loop in which they amplify each other.

This is a single cybernetic loop, a "clothesline loop over a pulley at each end": the more each side PULLS on it, the more the other side moves in the direction they want it to go.

So, the guys on the ground have to PULL on the rope, and willingly accept orders from above, while at the same time the guys at the top have to PULL on the other side of the rope, and willingly accept updates to their mental model of the situation from below.

The whole thing breaks down if either side fails to do its job. If generals issue orders but never listen to what the result was, the result is always defeat on the battlefield. If soldiers want a say in what's being decided next, but don't want to listen to the resulting stream of orders, that breaks down too.

But if both sides do their jobs, soldiers listen to orders (authority-1, the down-going rope), and generals listen to soldiers (authority-2, the up-going rope), then after a transitional period where trust is being built and this is becoming "phase-locked" and synchronized, we have the full power of cybernetic control available to the organization -- the best of both worlds.
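The two-pipe loop can be sketched as a toy simulation. The growth and decay rates here are invented purely to illustrate the claim that the loop amplifies when both sides pull and decays when either side stops:

```python
def run_loop(generals_pull, soldiers_pull, rounds=10):
    """Toy model of the two-channel 'clothesline' loop: orders flow
    down, field reports flow up, and trust compounds only when both
    sides keep pulling.  All rates are invented for illustration."""
    trust = 1.0
    for _ in range(rounds):
        if generals_pull and soldiers_pull:
            trust *= 1.2   # both channels flowing: the loop amplifies
        else:
            trust *= 0.8   # either channel stalls: the loop decays
    return trust

print(run_loop(True, True))    # ends above 1: the loop spins up
print(run_loop(True, False))   # ends below 1: push-only breaks down
```

The point of the multiplication is that the effect compounds round after round, which matches the "transitional period where trust is being built" described above: neither a single order obeyed nor a single report heard makes the system, but a sustained loop of both does.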

The transitional period to this model can be helped, I think, if what "lean" calls "the final state" is clear to everyone in advance. Lean production thinking ("The Toyota Way") is described by people such as Professor Jeffrey Liker as very strongly based on "philosophy", a term that is largely discounted and meaningless to Americans today.

A better word, the word used in FM22-100, is "Doctrine". That word also has a bad flavor in an American culture that worships "freedom", but some kinds of freedom get in the way of success. Runners with rigid bones can move faster than jellyfish. It's a nuanced subject, this rigor versus local-rigidity-with-pivots. But even more dreaded than "doctrine" are the words "discipline" and "standards."

The US Army has worked its way through those nuances, disentangled the different meanings of authority, and, to the extent their doctrine of accepting both command from above and "dissenting views and challenges to the model" from below is utilized, they are basically unstoppable in their mission.

They can still be defeated if their Doctrine is broken by leaders at the top who want to pick and choose, keeping the "giving orders to below" part, but discarding the "getting updates from below" part. That's not a failure of the Doctrine, it's a failure to follow the Doctrine.

The cybernetic "clothesline" only works if the up channel and down channel are both working and mutually supportive. Some of the first orders downward have to be "send more dissent upwards! We can't hear you!"

This is the nature of the problem that Professor Amy Edmondson researches, and what started this post: how to overcome fear of speaking up and empower workers to dissent.

But, use nuance, please. Dissent-type-1 (disagreeing with the mental model) needs to be ENcouraged. Dissent-type-2 (disagreeing with the command structure itself, or with one's role in complying with it) is to be DIScouraged, as always.

The command structure gains credibility and strength to the extent that the commands reflect good judgment based on good data, and the only source of that data is the boots on the ground at the front. If the soldiers keep their place (and listen to commands) and the generals keep their place (and listen to advice), it comes together and works.

The most common failure mode appears to be generals who mistake a fraudulent silence, caused by suppression of dissent-type-1, for agreement with their mental model, and then keep issuing orders detached from reality, resulting in contempt for the whole system and ultimately a collapse of the command structure entirely, not to mention military defeat.

======== afterword

I realized after I posted this that many middle-class suburban children have never actually seen a clothes-line these days. They've grown up with gas or electric dryers, and clothes-lines are prohibited as being tacky or lower-class by suburban Covenants and Restrictions for housing developments.

So, I put a picture of one above. (source: Blessings in the South.) It was remarkably hard to find this picture. This seems to be a "simple machine" of incredible importance for insight that today's generations don't even have in their mental toolbox or vocabulary.

For those who have never seen this in action, there is a loop of rope strung between two pulleys, one on each end attached to poles. In the picture above the lady has three such loops.
She stands in one place and hangs a sheet, say, over one side of the rope, clips it on with clothespins, and then pulls the other side of the loop toward her, which pulls the sheet she just hung away from her, opening up a new spot for the next thing to be hung. That way she doesn't have to move the basket of clothes or herself.

The rule of thumb, of course, is that "You can't push with a rope." Yet, with a loop of rope, effectively you CAN push with a rope, by pulling.

This is the magic of the vertical management loop in the Army Doctrine. To "push" your advice upwards, which is "impossible" as it is pushing on a rope, you "pull" on the commands coming downward. And at the top, to "push" your commands downward (also seemingly impossible), you clear the way by "pulling" the reactions and comments from the troops upwards.

Neither side can cheat here - the rope has to be continuous, and trust in it has to build over time, but then, this model actually does work for the US Army. They can balance very strict command with very good intelligence, overcoming the old saw that "army intelligence" was an oxymoron.

The same principle could be applied to government in general ("government intelligence") and to corporate management in general. The same rules apply. If both sides do their jobs, it works, and victory is possible. If either side only wants to "push" and doesn't want to "pull", the whole thing breaks down, becomes dysfunctional, and defeat is likely.

Sunday, June 24, 2007

What I learned at Johns Hopkins last week



Well, I saw something completely unexpected yesterday.

I wasn't posting here for most of last week because on Friday I completed a course, "Social and Behavioral Aspects of Public Health", at Johns Hopkins School of Public Health. I thought it was a good course and covered many key ideas, although I did wish it had gone into them in a little more depth.


But I am a finishing 3rd-year student (my last class! Hooray!), and most of the class had just started two weeks ago, so I could understand the need not to overwhelm people with new concepts. And that's what I thought was happening, but now I'm not so sure.

This is like those scenes in the movies where the music changes and everyone knows that the monster is approaching but our hero and heroine happily play on, oblivious.

During lectures, sometimes we would have a simple summary slide with content such as "Poverty is a carcinogen." We were supposed to evaluate that assertion, tease it apart, sort out what portions were true and how you could tell. This is part of a debate that's been raging for at least 400 years.

Many of these lectures were met with a startling silence by the students, who often had no questions at all. This surprised me as I thought there would at least be a heated discussion. Well, I thought, they're tired from working half the night on their classwork, or don't want to ask "dumb questions."

Still it was eerie to have the professor ask something and the room of 100 or so just sit there.

After the class, in the big blue shuttle to Baltimore-Washington International (BWI) airport, I discovered something I wish I'd known the first day, as it would have totally changed my behavior.

I chanced to ride with another MPH student I recognized and asked her what she thought of the class we'd just had. I hit a nerve. She had thought the class was a total waste of time and money, and put up with it just because it was required. She thought, basically, that the lessons the class taught were stupid, wrong-headed, wrong, soft, politically-motivated, you name it, and she had already discarded all of her notes. She was just livid.

Wow. None of that had come out in class. And, obviously, "my mileage varied." I liked the course and I don't think I'm an easy sell. I'm used to executive education programs where "students", often CEO's of companies, wouldn't hesitate a second to challenge something they disagreed with.

Apparently I had fallen into the common trap of interpreting stony silence as agreement, or consent. In point of fact, it was total disagreement and scorn, suppressed by a need to just complete the required course, hold one's breath, and put up with all this "psycho-babble" for two weeks. (She didn't say "psycho-babble", but could have.)

So we had missed a tremendous teaching opportunity to get this debate and dispute out on the table and have at it. What a great opportunity to get our feet wet on what it means to assert that "A causes B", and how we "prove" things, and what level of skepticism is expected, and what the burden of proof is on someone asserting some new claim, and how to meet that burden, etc.

It would have been a perfect chance to show a snippet of the TV show CSI: Crime Scene Investigation, where CSI head Gil Grissom could lecture us all on the need to suspend our suspicions and "let the data talk." We could have viewed a few cases where it was way too easy to believe that Mr. Jones obviously "did it" when, in fact, it was Miss Smith, in the kitchen, with a lead pipe.

We could have talked about how civilized grown-ups in the field disagree with each other's conclusions while remaining cordial and committed to careful ways of defending against being too gullible (a "type I error") or too skeptical (a "type II error").
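Those two failure modes can be made concrete with a toy simulation of testing whether a coin is biased: being too gullible means declaring a fair coin biased (type I, a false positive), and being too skeptical means clearing a coin that really is biased (type II, a false negative). All thresholds and probabilities below are made up for illustration:

```python
import random

def declare_biased_rate(flips, threshold, p_heads, trials, seed=0):
    """Fraction of simulated experiments in which we declare 'biased'
    because the heads count reached `threshold` out of `flips`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        heads = sum(rng.random() < p_heads for _ in range(flips))
        if heads >= threshold:
            hits += 1
    return hits / trials

# Type I error rate: a fair coin (p = 0.5) wrongly declared biased.
type_1 = declare_biased_rate(flips=100, threshold=60, p_heads=0.5, trials=2000)

# Type II error rate: a truly biased coin (p = 0.65) wrongly cleared.
type_2 = 1 - declare_biased_rate(flips=100, threshold=60, p_heads=0.65, trials=2000)

print(f"type I (gullible): {type_1:.3f}   type II (skeptical): {type_2:.3f}")
```

Moving the threshold trades one error for the other, which is exactly the debate the class never had: how much evidence a claim like "poverty is a carcinogen" should require before we accept or dismiss it.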

But, at least for this one student, that chance was missed. She had interpreted this class as just one more of those annoying things in life where a person in authority states or does something stupid and the best thing to do is just shut up and pretend you agree. In fact, silent and sullen obedience is the expected and demanded and rewarded behavior.

I guess it was rewarded here too, because I think she "passed." Hmm.

Way too many years ago, before I had taught in trade school or taught MBAs, a book came out titled "Summerhill", I think. It described a school in England that I actually went to visit because of the book. The school challenged a prevailing "infectious disease" notion of education that I can recall quite well:
Courses are something like the measles. They are something you "have", and then, since you've "had it" you don't need to "have it again."
Again, wow. I had thought that concept had died in the 60's. It seems to be resurgent. Or maybe it never left and I'm just finally looking up and noticing it.

Now, I'm the first to agree that I went into undergraduate Engineering at Cornell, after reading C.P. Snow's Two Cultures, because I just couldn't figure out how to deal with classes where the teacher would ask "What did Hemingway mean when he said X?" and I had no idea what to say next after I offered an opinion and the teacher told me I was "wrong." What the heck? What's with that?

At least in Engineering, if you say something should work and someone else says "No, it shouldn't" you can just both happily go down to the lab machine-shop and build one and just see whether it flies or not. No one ever wastes time talking about the "true nature of causality."

We'd just happily compute what size resistor to put at this point in a circuit without losing sleep over the meaning of "resistance," or whether we could be "certain" that changing the value would have the desired impact on the radio receiver actually working. If in doubt, put in a variable resistance potentiometer ("a pot") and turn the screw to change the value while watching the output on an oscilloscope. When you got it where you wanted it, Bingo: pull out the "pot", measure what resistance it was set to, solder a permanent resistor of that size into the circuit, and go play volleyball. No big deal.
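That tune-then-solder workflow translates directly into code: adjust a free parameter while watching the output, then freeze the value that works. A sketch using a simple two-resistor voltage divider (all component values are invented for illustration):

```python
def divider_out(v_in, r_top, r_bottom):
    """Output voltage of a two-resistor voltage divider."""
    return v_in * r_bottom / (r_top + r_bottom)

def tune_pot(v_in, r_top, target, lo=0.0, hi=100_000.0, steps=60):
    """Binary-search the 'pot' setting until the divider output hits
    the target -- the code equivalent of turning the screw while
    watching the oscilloscope."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if divider_out(v_in, r_top, mid) < target:
            lo = mid   # output too low: need more bottom-leg resistance
        else:
            hi = mid
    return (lo + hi) / 2   # the value you'd read off and solder in

r_fixed = tune_pot(v_in=9.0, r_top=10_000.0, target=3.0)
print(round(r_fixed), "ohms")   # ~5000 ohms gives 3 V from a 9 V supply
```

No philosophy of "resistance" required: the loop just keeps nudging the value and watching the output, the same empirical shortcut the paragraph describes.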

Maybe it's because I'm looking at social issues more than I used to, or maybe it's because society is changing, but that sort of way of gaining an answer to a question seems to be vanishing as the expected behavior of people.

Without some training and skill in the tools of Public Health, or other rigorous but often qualitative fields, we've reverted back to the Middle Ages where causality is either magical or determined by which "authority" one follows blindly.

Again, wow.

So, if I hold out my pencil and release it, and it falls to the ground, and I ask "Why does that happen?" I'm as likely to hear "God made it move" as "Gravity."

So, hmm. Is this an "either/or" question or an "and" question or what? Personally, I prefer to think that "gravity" made the pencil move, and allow that, if you like, you can add "... and God made gravity." At least with the "theory of gravity" I can write some equations, design equipment, know exactly how fast something will fall, plot trajectories, etc. It's a "theory with meat on the bones" that I can rely on to build stuff that works. I don't get much "predictive value" out of "God made it move."
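That is what "a theory with meat on the bones" means in practice: from d = ½ g t² you can compute exactly how long the pencil takes to fall and how fast it lands. A small sketch with the usual textbook value of g (air resistance ignored):

```python
import math

G = 9.81  # m/s^2, standard gravity near Earth's surface

def fall_time(height_m):
    """Time to fall `height_m` meters from rest, ignoring air
    resistance: d = (1/2) g t^2, so t = sqrt(2 d / g)."""
    return math.sqrt(2 * height_m / G)

def impact_speed(height_m):
    """Speed at impact for the same drop: v = g t."""
    return G * fall_time(height_m)

# A pencil released from roughly desk height (0.75 m is an assumed value):
t = fall_time(0.75)
print(f"falls in {t:.2f} s, hits at {impact_speed(0.75):.2f} m/s")
```

"God made it move" and "magic made it move" produce no such numbers in advance; that predictive power is the whole difference being argued here.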

But, I guess if you never had the math, and never did "get" introductory Physics, and the concept of "potential energy" baffled you, so the equations never gave you any insight or power, then it's pretty much equivalent to you to say "God made it move" or "magic made it move" or "gravity made it move." They're all invisible anyway, right?

I was busy raising children and missed the whole 80s and 90s trend toward cultural relativism applied to everything, including physical laws, so that "your idea of how gravity works is no better than anyone else's" and we should agree to let all three just get along: physics, magic, or God.

Besides, frankly, hey, when you get right down to it, I can't "see" gravity anyway. All I can "see" is the pencil. Your invisible force against my invisible force, it's a tie, right?

All of which gets me back to class. I guess we might need to add an introductory class that we never needed before, to socialize students to an accepted way of challenging assertions and assumptions and accepted ways of meeting the burden of proof without being blindly stubborn or gullible about it. We need to know when and how it's appropriate to raise our hand and say "How can you prove that?" in a neutral, polite, but insistent tone.

As about a zillion (technical term) of my previous posts discussed, a key requirement for a "high-reliability" culture is "mindfulness," which requires the ability and sensed-permission and sensed-expectation that you will surface questions you have, not submerge and suppress them.

If we can't have that discussion first, all the rest of this business with models and hypothesis testing and "p-values" and study design and statistical tests is, indeed, just magical rituals that you have to go through for some stupid legacy reason in order to get published. All this demand for "evidence-based" practice is just a waste of time, then. No wonder students are baffled by it.

Well, we all know the rule that "All Indians walk single-file .... at least the one I saw did."

So, I'm extrapolating to an entire entering class of students from observed puzzling behavior of stony silence and from one accidentally chosen student's opinion in a cab on the way to the airport. That suggests an underlying teaching opportunity that maybe I'm imagining or maybe is real.

How would we decide which it is?

I'm concerned that not a single student challenged the teachings and yet clearly, from this and other conversations, many others I checked with also disagreed -- in complete silence.

Again, wow. And these are all students with undergraduate degrees and at least two years of work experience and decent GRE scores. Maybe a third of them are already Medical Doctors (MDs).

What have we done? Can we trace this defect back upstream and find out where it's coming from?


And how can we undo it? And how could we measure our impact and know whether we had succeeded or not?

Those are good Public Health questions that deserve some time on the agenda. They're also major business problems that directly short-circuit techniques like "The Toyota Way" that I've discussed, that require that everyone should work with their eyes open and with permission, and even expectation, that they'll spot things that need to be changed and announce them.

An army of silent, obedient, sullen, blind robot lemmings is not a very solid basis on which to build a competitive economy or a good public health infrastructure that actually works, or an army that works, or anything that works, instead of one that everyone pretends works.

What have we done to our children?