I taught the first day of the Cynefin and Sense-making course here in Zurich without slides. I’m breaking the pattern entrainment of the slides I have used and modified over the last two years by finding different ways to explain and expand on some of the consequences of thinking from a complexity perspective. One aspect of speaking in this way is that you discover new ways of explaining things, but you also gain new insights and understanding. In effect the law of unintended consequences comes into play if you relax constraints.
Now unintended consequences are very important for complexity thinking. Chandler, in his excellent book on Resilience, makes the point that in a complex world governments have to take responsibility for the unintended consequences of their actions, and that understanding this is key to the future. In a complex adaptive system we have dispositionality not causality, so when we act we cannot know what will happen. That means we need to maintain the ability to respond quickly, something I brought in at the end when I talked about complexity approaches to project management.
Now we all know about the negative unintended consequences. The picture is of a cane toad, introduced into Australia to control pests, which is now a pest in itself. Rhododendrons in Snowdonia, rabbits in New Zealand and many other cases come to mind. But there are also good ones: the accidental insights (some of which are examples of exaptation) are key to progress. So managing complexity is about a reflective awareness of, and acceptance of, the unintended consequences of our acts, both individually and collectively. We can’t say we didn’t intend X, or that we could not have forecast Y, when we don’t like the results. In a complex world we have to accept that these things are a consequence of acting in the system.
So the simple fact that 82% of prisoners in the US are high school drop outs can be laid in part at the door of those responsible for education policy. Recent quantitative easing in the US has caused a surge in the fine art market as well. I could go on; there are lots and lots of examples. It is why in a complex world we do multiple parallel safe-to-fail interventions, so that we are sensitised from the start to the very simple fact that no plan survives its first encounter with reality, to adapt Clausewitz.
The fact that we know this is not an accident per se, but an inherent aspect of the system, has profound implications for strategy, governance and also ethics. If it’s a complex adaptive system then you know something will happen, you just don’t know what. So you are responsible not only for it, whatever it is, but also for building a system which has the resilience to cope. Engineering approaches based on efficiency not only don’t work, they represent an approach which in a very deep sense is unethical at best, evil at worst.