Sunday, August 12, 2007

Credit crunch reaches larger

More "unintended consequences" from the credit market leveraging everything against everything, trying to make infinite profit on zero assets. (a.k.a. "house of cards"):

Could this be seen coming? (from the Washington Post discussion with columnist Steven Pearlstein, 8/12/07):

Renfrew, Pa.: Steve, Good timing for a Q&A. One question: when you refer in your article to how "we are learning several painful truths about the new global financial system," would you say this is just another situation where, although the damage is done, there were plenty of sensible economists (like yourself) who knew exactly the danger and tried to alert a dumbed-down administration and its thoroughly braincell-challenged electorate about it? Kinda like Iraq, 9-11 and Hurricane Katrina?

Steven Pearlstein: I'm not an economist, by the way, but I have been warning about this for some time. The response of policy makers was, yes, that's a risk, but we don't see any sign of it. You have to wonder if they need to get their eyes checked.

Scope of the problem:

Tight credit could stall the buyout boom:
Washington Post

The severe turmoil in the credit markets last week has raised serious questions about the future of the buyout craze that gave rise to the biggest deals in U.S. corporate history.

For the past few years, a group of elite Wall Street players have been buying up major American icons and taking them private. These massive acquisitions have depended on access to cheap credit, which is supplied by a complex relationship between investment banks and hedge funds.

But with credit markets tightening, the pace of these deals, at least in the short run, is expected to dramatically slow. Already-announced multibillion-dollar buyouts, like Tribune Co., Sallie Mae and Hilton Hotels, are likely to be far more complicated to close, analysts said.

If one or two of these big deals were to collapse, it might not send the economy into a downturn. But it would profoundly shake investors' confidence in a financial system already under siege from billions of dollars in losses from home mortgage defaults. That could make it even more difficult for companies and home buyers to get loans.

And, how exactly is this happening? What are the "system effects" where changes over "there" have an impact over "here"?

Steven Pearlstein again:

Steven Pearlstein: Not sure about the difference in the rating system. But the problem really is that when some supposedly sophisticated investors saw a AAA rating for mortgage-based securities, they assumed there was no risk. Indeed, there is very little credit risk, meaning the risk of not getting paid.

But there is liquidity risk which the rating does not deal with -- the risk that, at some times, the market for these securities may dry up and they cannot be sold or priced. If you hold the securities till maturity -- till all the loans are repaid at the end of their term -- then you don't care about liquidity risk.

But if you are a hedge fund or a pension fund or even a bank that has to estimate the market value of that asset every day, or week, or month, or quarter, then you do care.

And if that price temporarily falls below the amount of the loan you used to buy it, then your bank suddenly cares and demands its money back (the dreaded margin call). Then you either have to sell the mortgage-backed security, if you can, or if not, sell some "good" asset.

And it is that sale of the "good" assets that is how this contagion has developed.
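
To make that chain concrete, here is a minimal sketch in Python (my own illustration; every name and number is invented, not taken from any actual fund) of how a mark-to-market drop in one security can force the sale of unrelated "good" assets:

```python
# Hypothetical margin-call sketch; all numbers are invented.

loan = 90.0         # borrowed to buy a mortgage-backed security (MBS)
mbs_value = 100.0   # mark-to-market value of the MBS
good_assets = 50.0  # unrelated, liquid assets the same fund holds

# The MBS market dries up and the security is marked down:
mbs_value = 82.0

# If the marked-down value no longer covers the loan, the lender
# demands cash (the dreaded margin call). The MBS itself cannot be
# sold -- that is the liquidity risk -- so the fund sells "good" assets:
shortfall = max(0.0, loan - mbs_value)
good_assets -= shortfall

print(f"margin call: {shortfall:.0f}; good assets remaining: {good_assets:.0f}")
```

Each forced sale depresses the price of the "good" assets, which can push other leveraged holders below their own loan thresholds, and the cycle repeats.
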
Finally, why exactly is it that these changes were "unexpected" or "unseen" when so many people saw them coming?

That is the pivotal question that we need to slow down and investigate. Which of the "instruments" on our leaders' "dashboards" is broken, and what else could leak through that hole? Apparently making billions of dollars in profit does not make people any smarter or improve their vision -- or, maybe, it makes it worse.

The linkage between things, "systems thinking," does seem to be precisely where human intuition fails to grasp the consequences of actions.

Furthermore, increased IQ or education, by itself, doesn't seem to protect against such thinking errors, and may even make them worse by adding overconfidence.

Here are a few quotations from MIT Professor John Sterman's textbook "Business Dynamics".

Many advocate the development of systems thinking - the ability to see the world as a complex system, in which we understand that "you can't just do one thing" and that "everything is connected to everything else." (p4)

Such learning is difficult and rare because a variety of structural impediments thwart the feedback processes required for learning to be successful. (p5)

Quoting Lewis Thomas (1974):
When you are confronted by any complex social system, such as an urban center or a hamster, with things about it that you're dissatisfied with and anxious to fix, you cannot just step in and set about fixing things with much hope of helping. This realization is one of the sore discouragements of our century.... You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn't counted on in other, remote parts. If you want to fix something you are first obligated to understand ... the whole system ... Intervening is a way of causing trouble.


In reality there are no side effects, there are just effects.

Unanticipated side effects arise because we too often act as if cause and effect were always closely linked in time and space. (p 11)

Most of us do not appreciate the ubiquity and invisibility of mental models, instead believing naively that our senses reveal the world as it is (p16).

The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run static view of the world with a holistic, broad, long-term dynamic view and then redesign our processes and institutions accordingly. (p18)

Quoting Nobel Prize winner Herbert Simon (p26): The capacity of the human mind for formulating and solving complex problems is very small compared with the size of the problem...

These studies led me to suggest that the observed dysfunction in dynamically complex settings arises from misperceptions of feedback. The mental models people use to guide their decisions are dynamically deficient. As discussed above, people generally adopt an event-based, open-loop view of causality, ignore feedback processes, fail to appreciate time delays between action and response in the reporting of information, ... (p27)

Further, the experiments show the misperceptions of feedback are robust to experience, financial incentives, and the presence of market institutions... First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. (p27)

People tend to think in single-strand causal series and have difficulty in systems with side effects and multiple causal pathways (much less feedback loops). (p28)

A fundamental principle of system dynamics states that the structure of the system gives rise to its behavior. However, people have a strong tendency to ... "blame the person rather than the system". We ... lose sight of how the structure of the system shaped our choices ... [which] diverts our attention from ... points where redesigning the system or governing policy can have a significant, sustained, beneficial effect on performance (Forrester 1969). (p29)

People cannot simulate mentally even the simplest possible feedback system, the first order linear positive feedback loop. (p29). Using more data points or graphing the data did not help, and mathematical training did not improve performance. (p29). People suffer from overconfidence ... wishful thinking ... and the illusion of control... Memory is distorted by hindsight, the availability and salience of examples, and the desirability of outcomes.
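
To see what that means, the first-order linear positive feedback loop is simply a stock whose inflow is proportional to the stock itself, so it grows exponentially. A tiny sketch (my own illustration, with an arbitrary 10% gain, not an example from the book) shows how badly the straight-line extrapolation we do in our heads undershoots:

```python
# First-order linear positive feedback: the inflow is proportional
# to the stock itself (x[t+1] = x[t] + g*x[t]), i.e. exponential
# growth. The 10% gain and 25 periods are arbitrary illustration values.

g = 0.10   # feedback gain per period
x = 100.0  # initial stock

for _ in range(25):
    x += g * x  # more x -> more inflow -> still more x

linear_guess = 100.0 + 25 * (g * 100.0)  # the intuitive straight line
print(f"actual: {x:.0f}  vs. linear guess: {linear_guess:.0f}")
# actual ~1083 vs. 350: mental simulation undershoots threefold
```
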

The research convincingly shows that scientists and professionals, not only "ordinary" people, suffer from many of these judgmental biases. (p30). Experiments show the tendency to seek confirmation is robust in the face of training in logic, mathematics, and statistics. (p31).

We avoid publicly testing our hypotheses and beliefs and avoid threatening issues. Above all, defensive behavior involves covering up the defensiveness and making these issues undiscussable, even when all parties are aware they exist. (p32).

Defensive routines often yield groupthink, where members of a group mutually reinforce their current beliefs, suppress dissent, and seal themselves off from those with different views or possible disconfirming evidence. Defensive routines ensure that the mental models of team members remain ill formed, ambiguous, and hidden. Thus learning by groups can suffer even beyond the impediments to individual learning. (p33).

Virtual worlds are the only practical way to experience catastrophe in advance of the real thing. In an afternoon, one can gain years of simulated experience. (p35).

The use of virtual worlds in managerial tasks, where the simulation compresses into minutes or hours dynamics extending over years or decades, is more recent and less widely adopted. Yet these are precisely the settings where ... the stakes are highest. (p35).
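
As a taste of what such a virtual world can teach, here is a toy stock-management simulation in Python (my own sketch in the spirit of Sterman's "beer game" experiments, not code from the book): orders arrive only after a delay, and an ordering rule that ignores the orders already in the pipeline produces the classic overshoot and oscillation:

```python
# Toy inventory simulation with a shipping delay. The ordering rule
# ignores orders already in transit -- the "misperception of feedback"
# -- so a simple step in demand makes the inventory oscillate.
# All numbers are arbitrary illustration values.

from collections import deque

TARGET = 100.0                    # desired inventory level
DELAY = 4                         # periods between ordering and arrival
inventory = 100.0
pipeline = deque([10.0] * DELAY)  # orders placed but not yet arrived

for t in range(30):
    demand = 15.0 if t >= 5 else 10.0   # demand steps up at t = 5
    inventory += pipeline.popleft() - demand
    # Naive rule: cover demand plus half the gap to the target,
    # paying no attention to the DELAY periods of orders in transit:
    order = max(0.0, demand + 0.5 * (TARGET - inventory))
    pipeline.append(order)
    print(f"t={t:2d}  inventory={inventory:6.1f}  order={order:5.1f}")
```

Run it and the inventory first sags, then overshoots well past the target before slowly settling down; in a second of simulation you experience what would take years of real orders and shipments to live through.
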

Without the discipline and constraint imposed by the rigorous testing imposed by simulation, it becomes all too easy for mental models to be driven by ideology or unconscious bias. (p37).

System dynamics was designed specifically to overcome these limitations. ... As Wolstenholme (1990) argues, qualitative systems tools should be made widely available so that those with limited mathematical background can benefit from them. (p38).

Most important ... simulation becomes the main, and perhaps the only way you can discover for yourself how complex systems work. (p38).

Thus endeth the reading for today.

As John Gall has pointed out so well, "Failure is our most important taboo."

I note that these thoughts of human limitations are what I call "volatile knowledge," in that, regardless of how much sense these make to you right now, by next week they will have evaporated from your brain. Our minds do not like to be challenged, and killing the messenger is commonplace. If we look in history books, the largest event of 1918, the massive killer influenza, has almost entirely disappeared or been relegated to a single sentence, as if, oh yes, that year it rained a lot.

Like beginning instrument pilots, most humans have a lot of trouble knowing how they should operate if it is true that their brains and eyes are routinely lying to them about what's going on and why. The usual method of resolving that conflict is to carefully erase and forget those inconvenient facts, and go back to trusting our senses.

In reality, computer-aided simulation, where available, and consultation with as wide and diverse a group as possible are the best protections we have against ourselves and our stubborn refusal to admit that we actually can't see very well, and that what we do see is suspect, or should be.

Whether in religion or science, that core humility is the first step towards wisdom.
