How do we measure experts?

September 10, 2007

Some of the comments on my first blog raised questions about expertise. Patrick Lambe wondered if it was possible for people in the KM community to develop expertise. I responded in a comment to Patrick but I think it may be useful to examine what criteria we use to assess experts.

Also, Wayne Zambergen asked directly: what is expertise?

I believe we can reduce the confusion by matching expertise to the complexity of the domain. We can’t expect the level of mastery from a weather forecaster that we do from a chess grandmaster. And we can expect even less from a Knowledge Management specialist.

According to Boisot and Snowden, one way to describe domains is to ask whether they are ordered, complex, or chaotic.

  • An ordered domain has high structure and high stability. Cause-effect relationships are directly perceived. Knowledge can be represented using abstract, symbolic language.
  • A complex domain is less structured and stable, and knowledge is often communicated through narrative. Cause-effect relationships have to be worked out through sensemaking.
  • Chaotic domains are minimally structured and stable, and we can see no clear cause-effect relationships. Expertise here tends to be intuitive and difficult to communicate to others.

The weather system is a complex domain. Chess is an ordered domain. We expect different levels of expertise from forecasters and grandmasters. Stock markets are complex and perhaps chaotic. The world facing knowledge managers is more complex than the world facing weather forecasters. Therefore, the level and kind of expertise will be different from what we would find in chess or weather forecasting.

Compared to these other domains, knowledge managers will find it much harder to form and discriminate categories, and the number of categories proposed to make sense of a phenomenon will be much higher (that is, less abstract). I invite readers to study Patrick’s wonderful book “Organising Knowledge: Taxonomies, Knowledge and Organisational Effectiveness.”

My assertion is that we should match the expected level and kind of expertise to the nature of the domain. Yes, it does make sense to talk about expert knowledge managers, but what we would expect from an expert KM would be different from what we would expect from a chess grandmaster.

Factors that will determine the level of expertise that can be achieved in a domain:

  • Complexity of the domain (discussed above).
  • Experiences a person can have: the number of repetitions and the variability of the experiences.
  • The nature of the feedback (speed, clarity, process vs. outcome feedback). Knowledge Managers don’t get a lot of feedback, and what they do get is overlaid with all kinds of other intervening events that force them to struggle to make sense of cause-effect relationships.
  • Access to Subject-Matter Experts (to observe them, to query them, and to get evaluations from them).
  • Self-direction: the ability to take responsibility for improving skills, through directed practice and other strategies.

We should also consider the kinds of knowledge experts would have. This is a fairly standard list we have all been using over the years. All except the first involve tacit knowledge.

  • Declarative knowledge
  • Routines – range and sophistication of routines that can be applied
  • Perceptual discriminations
  • Patterns and prototypes
  • Number and variety of episodes in an experience bank. These allow judgments of typicality, and detection of anomalies.
  • Mental models

How do we evaluate experts? There are several kinds of criteria we can use.

  • SMEs vs journeymen vs laymen. A comparative standard.
  • Comparison of SMEs to optimal performance. A normative standard.
  • Comparison of SMEs to mechanical devices (e.g., comparing stock brokers to the performance of the S&P 500). An analytical standard.
  • Nomination by peers and supervisors. A social standard.
  • Success on key decisions. An outcome standard.
  • Organizational contribution. An organizational standard.

For KMs, the relevant criteria are the comparative standard, the social standard, and the organizational standard. The normative and analytical standards don’t seem to apply. The outcome standard is too flimsy to be reliable but might be used if you can gather reasonable data on the KM systems developed.
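
For contrast, here is a minimal sketch of what the analytical standard can look like in a domain where it does apply: scoring an expert’s calls against a mechanical baseline, in the spirit of the stock broker vs. S&P 500 example above. The data, the “always up” rule, and the hit_rate function are all invented for illustration; the post itself prescribes no particular calculation.

```python
# Hypothetical sketch of the "analytical standard": compare an expert's
# judgements to a mechanical benchmark. All data here is made up.

def hit_rate(predictions, outcomes):
    """Fraction of predictions that matched the actual outcome."""
    hits = sum(1 for p, o in zip(predictions, outcomes) if p == o)
    return hits / len(predictions)

# Hypothetical binary calls ("up"/"down") over ten periods.
expert_calls    = ["up", "up", "down", "up", "down", "up", "up", "down", "up", "up"]
benchmark_calls = ["up"] * 10          # naive mechanical rule: always call "up"
actual          = ["up", "down", "down", "up", "up", "up", "down", "down", "up", "up"]

print("Expert hit rate:   ", hit_rate(expert_calls, actual))     # 0.7
print("Benchmark hit rate:", hit_rate(benchmark_calls, actual))  # 0.6
# Under the analytical standard, the expert only earns credit for the margin
# by which they beat the mechanical baseline.
```

For KMs there is usually no such clean baseline to compare against, which is one reason the analytical standard is hard to apply to them.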

The organizational contribution may be very important for KMs. In a complex/chaotic world we can’t expect KMs to predict the future very accurately. But they can raise issues and identify discriminations that help policymakers make decisions. They can warn people against applying practices that have worked on similar projects in the past but would fail here because a requirement is markedly different from the analogous cases.

What could we expect from expert KMs? Here are some potential metrics.

  • Sensemaking metrics, such as the accuracy of the predictions they make about users’ preferences and practices, the sophistication of their explanations for user behaviors, and the speed and accuracy of spotting anomalies (a sketch of the first of these follows this list).
  • Data gathering strategies – their power and efficiency.
  • Range and sophistication of mental models.
  • Ability to make important discriminations, see cause-effect links, and discount spurious cause-effect links.
  • Declarative knowledge – just the collection of facts they need to know.
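
As an illustration of the first metric, here is a minimal, hypothetical sketch of scoring the accuracy of a KM’s predictions about user preferences. The users, channels, and survey results are all invented; the post does not prescribe any particular measurement.

```python
# Hypothetical sketch: score a KM's predictions about which channel each user
# will prefer for a new knowledge base. All names and data are made up.

predicted_preference = {
    "alice": "wiki",
    "bob": "email digest",
    "carol": "wiki",
    "dave": "team briefing",
}

# What the users actually reported in a follow-up survey (invented data).
observed_preference = {
    "alice": "wiki",
    "bob": "team briefing",
    "carol": "wiki",
    "dave": "team briefing",
}

correct = sum(
    1 for user, guess in predicted_preference.items()
    if observed_preference.get(user) == guess
)
accuracy = correct / len(predicted_preference)
print(f"Prediction accuracy: {accuracy:.0%}")  # 75% in this made-up example
```

Tracked over time, a rising accuracy score would be one modest, concrete signal of growing sensemaking expertise.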

Finally, how can KMs develop expertise?

  • Unsupervised instruction.
  • Trial and error; reinforcement/feedback opportunities.
  • Observation of others, review of previous cases.
  • Supervised instruction (classwork).
  • Semi-supervised instruction, such as On-the-Job Training (OJT) and informal coaching.
