The Empire repeats

August 5, 2008

My original blog The Emperor’s Chess Board was republished on the ActKM forum and it’s been interesting to observe some of the exchanges, although I am staying out of them for the moment as part of a self-imposed break from listservs which may become permanent. In general they don’t seem to have grasped the idea that I was challenging the concept of centralising a government knowledge function (I will qualify this a bit in a future post), arguing that it would manifestly fail to achieve the key goal of making a nation more secure. So far the following themes seem to have emerged:

  • The idea that this time we will get it right. The argument here is that knowledge management has learnt from its mistakes and a centralised function would not focus on information management solutions alone. This has also introduced the concept of speaking truth to power, which seeks to directly address my earlier point about the difficulty of joining up the dots. The argument here is that in the cases I quoted (the shuttle disasters for example) people did know what was happening but were overruled by political considerations.
  • The scary idea that any approach needs clear definitions of knowledge management, and criteria by which various claims can be validated. This of course means adopting a particular philosophical approach (in this case a variation of Popper); it (i) requires a degree of intellectual rigour in government that is not likely any time soon and (ii) fails to appreciate the messy, bottom-up approach implied by taking a complexity science view of the problem.

Now I don’t intend to devote any time to the second theme; I think it defeats itself by its nature. However the first, “this time we will get it right”, approach is interesting if only for the illustrations it gives of the linear thinking which so prevails in this space. So in this third post in the series I want to address that, in effect an extension of my earlier two posts, before moving to some hopefully pragmatic suggestions on the way forward.

Now it is undoubtedly true that frequently people who need to be heard are not heard. With the benefit of hindsight those who got it right are suddenly seen as prescient martyrs whose advice should have been taken. On the face of it this seems to make sense, and if their advice was rejected then it’s easy to find a source to blame, politicians or whatever. We like to believe that there is a reason for things; it’s called the fundamental attribution error. But following it is rather like the Emperor who was conned out of his rice crop: following what may seem to be common sense can cost you a lot, in this case probably the chance to detect the pre-conditions of the next terrorist attack.

One of the many problems is the sheer issue of numbers again. In a very large and distributed workforce the number of people making educated guesses or hunches is very high, and we tend to underestimate the probabilities involved. Once the event has happened we pay attention to those whose predictions proved correct; we don’t pay attention to the number that were proved incorrect. One famous con trick involved someone sending out a thousand emails, predicting to half of the recipients that a stock would go up and to the other half that it would go down. When it went up, a second email went out to the 500 who had received the correct call to say “I got it right”, followed by a prediction on another stock: up to 250 of them, down to the other 250. The process continued until a small group of people thought the con-man was infallible, at which point they were stung big time. In any event, after the event some will have been proved right; statistics tells us so. It does not mean that they were right for the right reasons, or that they will be right again, or for that matter that they should have been paid attention to.
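To make the arithmetic of that con trick concrete, here is a minimal Python sketch (not from the original post; the five rounds are an assumption for illustration). Whichever way the stock moves, half the remaining recipients have just received a correct call, so after five rounds roughly 31 of the original 1,000 have seen nothing but accurate predictions, purely by construction.

```python
# A minimal sketch of the arithmetic behind the stock-tip con described above.
# The figures (1,000 recipients, halving each round) follow the anecdote; the
# number of rounds is an assumed value for illustration only.

def survivors(recipients: int, rounds: int) -> list[int]:
    """Return how many people have seen only correct predictions after each round.

    Each round the con-man splits the remaining pool in half, telling one half
    the stock will rise and the other half it will fall. Whichever way it moves,
    half the pool has now received another "correct" call and is kept.
    """
    pool = recipients
    history = []
    for _ in range(rounds):
        pool //= 2  # only the half that received the correct call is retained
        history.append(pool)
    return history


if __name__ == "__main__":
    for round_no, remaining in enumerate(survivors(1000, 5), start=1):
        print(f"after round {round_no}: {remaining} people have seen "
              f"{round_no} correct predictions in a row")
```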

We did an experiment some years ago, working with the 9/11 report to get people to understand the complexity concept of attractors. The morning was a complete failure. With the benefit of hindsight they could all see what should have been done, but they could not describe it in terms of uncertainty. Over lunch I was depressed. But then a conversation started about a current situation where the outcome was uncertain and it was unclear which signals should have attention paid to them. All of a sudden, in the context of uncertainty, methods designed for uncertainty worked.

What you have got here, and it is VERY VERY dangerous, is the use of hindsight as a substitute for foresight. We need methods and tools designed for foresight, and they do not include centralised KM functions for government, but rather something more complex. More on that when I have thought up the next variation of Empire for the heading.
