Dietrich Dörner - The Logic of Failure
(This is a repost from Livejournal)
Dietrich Dörner is a German cognitive researcher. In this book he tries to find out why things go wrong - and what people do afterwards. Among other things, he describes the results of various simulations in which the test subjects were asked to administer a couple of communities run as computer simulations, including two hypothetical African tribes and a small town with a watch factory. Many of them did not do very well. The reactions to their failure were the usual ones:
* Blaming outside forces, including the environment.
* Accusations of unfairness - that the scenario was sabotaged from the start.
* Withdrawal behind ideological arguments or euphemisms.
* Declaring that the (simulated) people must make sacrifices (when their leader's plans went wrong).
* Concentrating on one thing to the exclusion of everything else (such as a favorite irrigation project, regardless of everything else that happens).
* Repair-service behavior - fixing things only when they go wrong, instead of finding out why they fail, so the problem could be avoided in the future.
* Claiming that they had said the idea would not work, even if it came from them (they might actually forget where the idea came from).
Sound familiar?
Politicians tend to think in terms of ideological goals (and if they stop doing that, they might lose their supporters). Some people concentrate on the things they know best - which is not good when they have to make large-scale decisions. Leaders push their political ideals and/or pet theories and disregard any reports of failure. Too much information can become a liability: even though you should take everything into account, you really can't. Are you delegating anything?
Dörner does not give "correct answers", probably because they do not really exist. One real-life case Dörner mentions is Chernobyl: the accident happened when the operators followed a test procedure as usual and did not take into account the possibility of a meltdown; they were too confident that they could handle everything. And you probably know the result…