This post is inspired by a tweet from Tese Stephens (Teresa M. Stephens, PhD, MSN, RN, CNE) reflecting her current work on "moral distress" in nursing.
The subject of abusive contexts in teams or organizations is tangled with the subject of "psychological safety" as taught by Professor Amy Edmondson.
Her poster is here, but you need an image-processing program to zoom in enough to read the text (it's a 4096 x 4096 pixel image!). It is the expanded version of this tweet:
https://twitter.com/DrTese/status/1221089268740902915
https://pbs.twimg.com/media/EPIuNZ1X0AUOHGe?format=jpg&name=4096x4096
She's on Twitter as @DrTese. Her self-description is: "Christ follower. Boat Rocker. Nurse researcher/educator exploring resilience as a means to fight moral distress and burnout. Holocaust educator. Views are mine."
=======================================
Before going on, let me put in a plug for the opposite end of the spectrum -- innovation, entrepreneurship, and the navigation of uncharted waters through virtue-driven consultation and listening to every voice at every level:
Spiritual Solutions to Economic Problems - through Baha'i consultation
https://newbricks.blogspot.com/2014/05/spiritual-solutions-to-economic.html
I think religious and spiritual factors are extremely relevant, because it is precisely strong fellowship and membership in an organization that upholds very high standards, and is polarizing about it, that gives a person the strength to stand up in strong headwinds and insist on having a voice.
Examples would be nursing professional organizations, church or other worship communities, and the US Army. It takes a village to be a strong individual, especially when it means going toe-to-toe with a social culture that opposes virtue, or that finds virtue in accepting, winking at, or even worshipping selfishness, greed, etc.
===================================
And one more totally surprising fact: probably the most top-down, mission-critical organization in the world, the US Army, uses after-action debriefings to hear the voice of the lowliest private about what is working and what is not. So if the US Army can retain authority at the top and still listen to its own staff, why can't every organization?
US Army Leadership Field Manual FM 22-100
===================================
I'm an IT systems analyst (retired) and worked 15 years at the University of Michigan's 890-bed teaching hospital in Ann Arbor, where I observed firsthand many ways the nurses were left out of the conversation about what sort of Electronic Health Record or Physician Order Entry system the hospital should have.
Around the globe it is extremely common for large-scale IT projects to fail, and Electronic Health Record implementations are high on the list of disasters caused by the failure of higher-ups to listen to the staff. This sort of thing is common across industries, not just in health care. Arrogance of management is widespread.
Part of the problem is that we have confounded "authority", meaning one who is an expert, with "authority", meaning one who is in charge. In the 1800s these would have been the same person, but today the world is vastly more complicated and no person can possibly be an expert in everything. Yet the powerful narrative that 'the boss' is supposed to be 'the expert' massively distorts necessary conversations, as dissent and well-deserved advice are taken as hostile enemy action.
The Toyota Way describes how Toyota (and, similarly, Honda) rose from war-devastated Japan to overtake the Big Three auto makers in the USA. A key part of that process was a culture of massive transparency and deliberate efforts to expose, and then fix, errors and waste at every level, including management. There was no taboo on a "360-degree assessment" actually looking up for sources of problems in operations, as well as the normal looking around and down -- that's why it worked, and that's why it was so unpopular in America.
As a private pilot myself, I have also been following the literature on aviation safety as well as Amy Edmondson's work on "Psychological Safety". I did my MPH work at Johns Hopkins and took the course in Patient Safety from Dr. Peter Pronovost, which covered many of these topics in depth. I also ended up working in the Office of Clinical Affairs, a decade ago now, harvesting data on medication administration and preparing the monthly internal reports as well as the federally mandated reports on the same.
Here's an index into some of the posts I wrote in the past on safety, errors, system-level errors, etc. I actually wrote several hundred; these are the most important and immediately relevant ones.
A systems-level analysis of why large organizations develop blind spots
The road to error - illustrated
The Crash of Comair 5191 in Lexington KY -- taking off on the wrong runway
The crash of Comair 5191 - a multi-level systems analysis of how things go so wrong
Comair 5191 - Confirmation Bias and Framing (1/20/07)
Cockpit voice recorder transcripts
Washington DC Crash of Air Florida was 25 years ago - remembered
(with links to BMJ, High-reliability engineering, TEM, etc.)
On the Deepwater Horizon oil spill into the Gulf of Mexico
https://newbricks.blogspot.com/2020/02/deepwater-horizon-lessons-we-still.html
https://newbricks.blogspot.com/2010/05/oil-next-time.html
https://newbricks.blogspot.com/2010/11/gulf-spill-report-reveals-lack-of.html
Hypnotized in high places - Northwest Flight 188
https://newbricks.blogspot.com/2009/10/hypnotic-trance-in-high-places.html
Things we have to believe to see
Why men don't ask for directions
PISA/OECD - Why our education stresses the wrong way of seeing
Failure is perhaps our most taboo subject (link to John Gall Systemantics)
Active strength through emergent synthesis
US - Economy of arrogance (and blindness)
Virtue drives the bottom line - secrets of high-reliability systems
The importance of social relationships.
Houston - we have another problem (on complexity and limits of one person's mind)
High-Reliability Organizations and asking for help (my thoughts)
Secrets of High-Reliability Organizations (in-depth academic paper, MIT)
High-Reliability.org web site
Threat and Error Management - aviation and hospital safety - Texas
Institute of Medicine - Crossing the Quality Chasm and microsystems (small group teamwork)
Nineteen case studies of health care organizations that dramatically improved their operations through the use of feedback-regulated small-team ("microsystems") operations are well documented in another post.