2023 Q1 | Edition 1 | Article 3

A proud peacock

How do you know you know?

Human bias can be a rabbit hole. Start poking around and you’ll soon struggle to find your way back to confident thought. We are sociologically and biologically unique, and we bring the good and the bad of everything we’ve experienced to every choice we make. We don’t want to bring arrogance to decision-making, but at the same time, confidence goes a long way. Yet once you start to look at the list of biases – optimism bias, confirmation bias, selection bias – and their uglier manifestations in prejudice, it becomes hard to trust our own views.

The Dunning-Kruger Effect

Of all the cognitive biases, one of the most fascinating is the one uncovered by David Dunning and Justin Kruger. In their study, volunteers completed two assessments. The first was an objective test of performance on logical reasoning tasks, designed by professional assessment teams. The second was a self-assessment, in which participants estimated how well they had performed – a judgement that can rest on many different factors. Since then, the experiment has been repeated across a range of fields, from business to aviation and medicine.

What the study found was that many people who were unskilled at a particular task showed overconfidence in their ability: the weakest performers overestimated themselves the most, while genuine experts tended to rate themselves lower than their actual performance warranted. The people who think they know probably don’t know. And the people who know don’t realise that they know!

According to the Dunning-Kruger account, we are at the peak of our confidence when we are least knowledgeable about a specific task. We enter a crisis of confidence as we start to learn more about the subject, but eventually growing expertise brings growing confidence. It just never quite returns to the heights it reached when we knew nothing!

Is the world full of know-it-alls?

There are many possible metacognitive explanations for this. If you’re a beginner at something, you may not have a full map of what success looks like. For example, I find it hard to watch dancing competitions and see what the experts can see in a performance – I just know whether I liked it. It could simply show how hard it is to carry out any self-assessment. It could be optimism bias creeping in: we all like to think things will go well, but a more experienced person might see the pitfalls along the way. It does imply the worrying idea that ignorance of a specific task or topic breeds arrogance – something we might recognise in some online commentary.

How does this bear on working with statistics? Well, at some point we all need to develop metacognition. In other words, rather than using data analytics output as a substitute for careful thinking, we need to spend a little more time developing the ability to see what evidence or insights we lack. Competence includes being able to tell the competent from the incompetent – a controversial idea, and one that simplifies a complex process. The better we understand data analytics, the more questions we ask.

It could be that the results of Dunning-Kruger studies are themselves a statistical artefact, a product of the way the tests were designed. Research on this phenomenon is a contested field and the debate is nuanced, but the Dunning-Kruger effect has stood up well to challenges over the years. One issue could be the ‘better-than-average effect’. Most of us are average, but few of us believe it. In one of the earliest studies in this area, a team at the University of Nebraska reported that 94% of university faculty considered themselves better-than-average teachers (Cross, 1977), and 68% optimistically placed their teaching in the top 25%! Hopefully they weren’t all sampled from the mathematics department.
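To see what a ‘statistical artefact’ could look like here, consider a minimal, purely illustrative simulation in Python (the code and every number in it are assumptions for the sketch, not data from any real study). Suppose self-assessments were nothing more than noise centred slightly above average – a better-than-average tendency with no link at all to real ability. Binning people by their actual scores and comparing average perceived with average actual percentiles per quartile would still produce the familiar picture: the bottom quartile appears to overestimate and the top quartile to underestimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Actual scores and self-estimated percentiles are *independent* here:
# self-estimates are pure noise centred slightly above average (a
# "better-than-average" tendency), with no link to real ability.
actual_score = rng.normal(0, 1, n)
perceived_pct = np.clip(rng.normal(65, 20, n), 0, 100)

# Convert actual scores to percentiles, then bin people into quartiles
# of actual performance, as the classic Dunning-Kruger plots do.
actual_pct = 100 * actual_score.argsort().argsort() / (n - 1)
quartile = np.digitize(actual_pct, [25, 50, 75])  # 0 = bottom, 3 = top

for q in range(4):
    mask = quartile == q
    print(f"Quartile {q + 1}: actual {actual_pct[mask].mean():5.1f}, "
          f"perceived {perceived_pct[mask].mean():5.1f}")
```

In this toy data there is no genuine mis-calibration to explain: the pattern falls out of the binning and the flat self-estimates. That is the shape of the critique, not a verdict on the real studies, which, as noted above, have largely withstood it.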

A smiley face drawn on the ground

Does it matter?

Over-confidence and under-confidence, in the psychological sense, can be dangerous in some situations. Overconfidence in pilots or emergency medical response teams, for example, may prove fatal. Academic researchers this century have largely replaced arrogance with caution. A few signs that we have taken overconfidence too far include:

· Failing to plan thoroughly

· Taking unnecessary risks

· Failing to make contingency plans

Sometimes a bit of over-confidence is necessary, though, to move things forward. As data producers and consumers, we need to pick our words carefully when reporting on our work.

References:

Cross, P., 1977, Not can but will college teachers be improved? New Directions for Higher Education, 17, 1-15

Dunning, D., 2011, The Dunning-Kruger effect: On being ignorant of one’s own ignorance, Advances in Experimental Social Psychology, Elsevier

For more examples of the Dunning-Kruger effect, see:

https://www.britannica.com/science/Dunning-Kruger-effect
