The first picture shows rising and falling output. This is often what people mean or think of when they talk about "positive" and "negative" feedback.
Unfortunately, that's also where their concept of "feedback" stops, so they miss all the good stuff.
The next picture shows converging output as a result of a simple control ("goal seeking") feedback loop.
The output rises or falls to some preset value or "goal".
Then, the system can be "tweaked" a little so it converges faster on the goal, but that often results in overshooting and coming back with a little bit (or a lot) of bouncing.
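Here's a minimal sketch of that behavior in Python (the goal and gain values are invented for illustration): each step corrects the output by some fraction of the remaining error. A gentle gain converges smoothly; a "tweaked" higher gain gets there faster but overshoots and bounces.

```python
# A goal-seeking feedback loop: correct the output by `gain` times the
# remaining error each step. (Goal, gains, and step counts are invented.)
def track(goal, gain, steps=20, start=0.0):
    x = start
    history = [x]
    for _ in range(steps):
        x += gain * (goal - x)   # feedback: correction proportional to error
        history.append(x)
    return history

print(track(100, 0.3)[:8])   # gentle gain: smooth, steady convergence
print(track(100, 1.5)[:8])   # "tweaked" higher gain: overshoots, then bounces in
```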
The next picture, of the car approaching a hill from the flatland below, shows how a speed control system can do a good job of maintaining the same speed, even when the outside world changes a lot.
Then the picture of the car going up and down the mountain explains more about that. Without speed "control", the car would slow down going up the hill, and speed up a lot going down the hill. Instead, the speed is almost constant.
But this whole effect of locking down, or "latching", or "clamping" a variable, such as speed, to some predetermined value is really confusing to statistical analysis. The effect is that a variation that is expected to be there is not there. There's no trace of it. So far as statistical analysis shows, there is absolutely no relationship between the slope of the hill and the speed of the car. Well, that's true and false. The speed may not be changing, but the speed of the engine has changed a lot.
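To make that concrete, here's a toy cruise-control simulation (the "physics", gains, and numbers are all invented, not a real vehicle model). The hill is nearly invisible in the speed data, but it shows up loud and clear in the throttle:

```python
# A toy cruise control on a hill. Watch which variable the hill
# shows up in, statistically. (All dynamics and gains are invented.)
from statistics import correlation   # Python 3.10+

SP = 60.0                              # the speed we want to hold
speed, throttle = SP, 30.0             # start settled on flat ground
slopes, speeds, throttles = [], [], []

for t in range(3000):
    slope = 0.05 if 1000 <= t < 2000 else 0.0         # flat, then a long hill, then flat
    speed += throttle - 0.5 * speed - 200.0 * slope   # crude car dynamics
    throttle += 0.0625 * (SP - speed)                 # feedback: work the throttle to erase the error
    slopes.append(slope); speeds.append(speed); throttles.append(throttle)

on_target = sum(abs(s - SP) < 1.0 for s in speeds) / len(speeds)
print(f"speed within 1 of setpoint {on_target:.0%} of the time")
print(f"corr(slope, speed)    = {correlation(slopes, speeds):+.2f}")    # weak: the hill barely registers here
print(f"corr(slope, throttle) = {correlation(slopes, throttles):+.2f}") # strong: the engine "remembers" the hill
```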
The same kind of effect could be seen in an anti-smoking campaign. The level of smoking in a region is constant, and then you spend $10,000 to try to reduce smoking. The tobacco companies notice a slight drop and counter by spending $200,000 to increase advertising. The net result is zero change in the smoking rate. Did your intervention have no effect? Well, yes and no.
The output (cigarette sales) has been "clamped" to a set value by a feedback control loop, so it varies much less than you'd expect. Again, this is hard to "see" with statistics that assume there is no feedback loop involved in the process.
For that matter, the fact that the "usual" statistical tests should ONLY be used if there is no feedback loop is often either unknown or dismissed casually, when it's the most important fact on the table.
(The "General Linear Model" only gives you reliable results if the world is, well, "linear" -- and feedback loop relationships are NEVER linear, unless they're FLAT, which also confuses the statistical tests, and sometimes the statisticians or policy makers.
The good news is that there is a transformation of the data that makes it "linear" again -- it involves "Laplace Transforms", which I'm not going to get into today. But stay tuned: we can make this circular world "linear" again so it can be analyzed, and you guys can compute your "p-values", statistical tests of significance, hypothesis tests, etc.)
OK, then, I illustrate INSTABILITY caused by a "control loop". In this case, a new driver follows a poor set of rules ("If slow, hit the gas. If fast, hit the brake."). Those rules result in a very jerky ride, alternating between going too fast and too slow.
Note, however, that the CAR is not broken. The pedals are not broken. The only problem is that the mental rules used to transform the news about the speed into pedal action are a poor choice of rules -- in this case, they have no "look ahead" built into them.
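A minimal sketch of those rules (the speeds and pedal strength are invented): the car obeys the pedals perfectly, yet the ride never settles.

```python
# The new driver's rules, simulated. The car and pedals work perfectly;
# the RULES produce the jerky ride. (All numbers are invented.)
speed, target = 50.0, 60.0
for step in range(10):
    pedal = +8.0 if speed < target else -8.0   # "If slow, hit the gas. If fast, hit the brake."
    speed += pedal                             # the car responds instantly; no look-ahead anywhere
    print(f"step {step}: speed {speed:.0f}")   # bounces between 58 and 66 forever
```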
Then I have a really busy picture that's actually three pictures in one.
The top left has a red line showing how some variable, say the position of a ship in a river, varies over time. The ship stays mostly mid-stream until the boss decides to "help". Say the boss is up in the fog and needs to get news from the deckhands, who can actually see the river and the river banks.
Unfortunately, the boss gets position reports by a runner, who takes 5 minutes to get up to the cabin.
As a result, using perfectly good RULES, the captain sees that the ship is heading too far to the right. (Well, yes, that's PORT or STARBOARD or some nautical term. For now, call it "right".)
So, she uses a good rule -- if the ship is heading too far to the right, turn it more to the LEFT -- and issues that command.
The problem is that the crew had already adjusted for the too-far-to-the-right problem, but too recently for the captain to know about it, given the 5-minute delay. So, the captain tells them to turn even MORE to the left, which only makes the problem worse.
The resulting control loop has become unstable, and the ship will crash into one shore or the other -- not because any person is doing the wrong thing, but because the wrongness is extremely subtle. There is a LAG TIME between where the ship WAS and where the captain thinks it is NOW, based on her "dashboard".
That "little" change makes a stable system suddenly become unstable and deadly.
People who are familiar with the ways of control systems will be on the lookout for such effects, and take steps to counteract them. People who skipped this lesson are more likely to drive the ship onto the rocks, while complaining about baffling incompetency, either above or below their own level in the organization.
The last picture shows some of the things that "control system engineers" think about.
These are terms such as "rise time", "overshoot", "settling time", and "stability". And Cost.
These terms deal with how the system will respond to an external change, if one happens.
But a lot of the effort and tools are dedicated to being sure that the system, as built, will be STABLE, and won't cause reasonable components, doing reasonable things, to crash into something.
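For the curious, here's a minimal sketch of how three of those terms get measured, using an invented second-order loop as the stand-in system:

```python
# Measuring rise time, overshoot, and settling time from a step response.
# The loop below is an invented stand-in for "some system", not a real one.
def step_response(steps=100):
    x = v = 0.0                            # output and its rate of change
    out = []
    for _ in range(steps):
        v += 1.6 * (1.0 - x) - 0.5 * v     # push toward the new goal (1.0), with damping
        x += 0.1 * v
        out.append(x)
    return out

y = step_response()
rise = next(i for i, x in enumerate(y) if x >= 0.9)                  # first time at 90% of the goal
overshoot = 100 * (max(y) - 1.0)                                     # how far past the goal, in %
settle = 1 + max(i for i, x in enumerate(y) if abs(x - 1.0) > 0.02)  # last time outside a 2% band
print(f"rise time: {rise} steps, overshoot: {overshoot:.0f}%, settling time: {settle} steps")
```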
This kind of stability is a "system variable" in a very real sense that is lost when any heap of parts that interact is called "a system." It is something that has a very real physical meaning. It is something that can be measured, directly or indirectly. It is something that can be managed and controlled by very small changes, such as reducing lag times for data to get from person A to person B.
And my whole point is that this is something people analyzing and designing organizational behavior and public health regulatory interventions should understand and use on a daily basis.
Maybe we need a simulator, or game, that is fun to play and gets people into situations where they have to understand these concepts, on a gut level, in order to "win" the game.
These are not "alien" concepts. Most of our lives we are in one or another kind of feedback control loop, and we have LOTS of experience with what goes right and wrong in them -- we just haven't categorized it into these buckets and recognized what's going on yet.
One thing I will confidently assert is that once you understand what a feedback control loop looks like, and how to spot them, your eyes will open and the entire world around you will be transformed. Suddenly, you'll be surrounded by feedback loops that weren't there before.
The difficulty in seeing them may be due to the fact that what is flowing around this loop is "control information", and it can ride on any carrier, as I showed yesterday with the person getting a glass of water. The information can travel in liquids, solids, nerve cells, telephone wires, the internet, light rays, etc., and is pretty indifferent as to what it hitches a ride on.
The instruments keep changing, but the song is what matters.
You have to stop focusing on the instruments and listen to the song. Control System Engineering is about the songs that everything around us is singing. Once we learn to hear them, they're everywhere. Life at every level is dense with them. And they seem to be a little bit aware of each other, because sometimes they get into echoes and harmonies across levels and seem to entrain each other.
It's beautiful to behold. I recommend it!
W.
3 comments:
Feedback loops result in "small" things becoming "big" things.
A really good example is a tornado, because it not only has this effect, but it literally goes around and closes on itself. It's about as non-linear as you can get.
The very small effects of water vapor condensing and releasing a little heat, and the Coriolis effect of the Earth's rotation on the angular momentum of moving air, are usually negligible - until they get into a reinforcing feedback loop.
Then, suddenly, exactly the same tiny effects that were negligible become dominant.
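A minimal sketch of that takeover (the growth rate and starting size are invented): a tiny effect, fed back into itself around the loop, goes from noise to dominance.

```python
# A reinforcing feedback loop: each pass around the loop strengthens the
# effect by 50%. The starting size and rate are invented for illustration.
effect = 0.001                       # negligible on its own
for cycle in range(0, 25, 4):
    print(f"cycle {cycle:2d}: effect {effect:10.3f}")
    for _ in range(4):
        effect *= 1.5                # the loop feeds the effect back into itself
```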
This is one of the "songs" that control systems "sing", and it is completely general, and can happen in any substance or organization or carrier of control information, at any scale.
People who normally interact in random ways may abruptly converge on some movement and suddenly be swept up in it and do amazing things, then it all dissipates and no one quite knows what just happened. Same song.
I also tried to create some curricular tools to quickly demonstrate systems dynamics concepts. Instead of using the cartoon metaphor, as you did in these illustrations, I tried to create a process that went from concept mapping to systems dynamics.
Although one of the slides has the incorrect title, and flickr seems to not order the slides correctly in the set, you can take a gander at my attempt here: Systems Dynamics for Practitioners of Dialogue.
You also might consider cross posting your illustrations to the Visual Thinking Art blog or flickr pool.
Good job! I like it!