Do you really “know” anything?

The problem with ultracrepidarianism, and what we can do about it

Tania
4 min read · Mar 31, 2021

As humans, we like to gossip. We talk and form opinions about things we do not truly understand (how many conversations have you had, or overheard, about dietetics, nutrition, or Bitcoin?). There is a word for that: ultracrepidarianism. Why do we overestimate our knowledge? Are “experts” real? And how can we overcome these biases?

🏆 We overestimate our understanding of things

People have strong opinions about nuclear energy, GMOs, and Obamacare. But could you explain nuclear fission, stem cells, or the implications of Obamacare? The truth is, even Obama doesn’t truly understand Obamacare; it’s too long and too complicated. The worst part is, we don’t know about our own ignorance. Three phenomena help explain this:

  • According to Dunning and Kruger, who gave their name to one of the most famous cognitive biases (the Dunning-Kruger effect), realizing your own ignorance paradoxically requires something close to an expert’s level of competence. This means that anybody who isn’t a specialist probably thinks they’re far more knowledgeable than they actually are…
  • A corresponding bias is the Illusion of Explanatory Depth. In a study, people were asked how well they understood everyday objects, such as a bike, and then asked to explain them. It turned out we largely overestimate our knowledge: confidence dropped drastically when people were confronted with their own ignorance.
  • Another phenomenon is Contagious Understanding, described by S. Sloman. In one set of experiments, people read about a made-up phenomenon (such as glowing rocks). When asked to rate their own understanding, those who were told that scientists could explain it gave higher ratings than those who were told it was mysterious. The mere fact that something could be understood made them feel they already understood it… We sometimes fail to tell apart the knowledge that resides in our own heads from the knowledge that resides elsewhere (in other people’s heads).
Which one is a real bike? It turns out even regular cyclists couldn’t tell that none of these models is drawn correctly.

👼 We fall for the Halo Effect

According to É. Klein, a French philosopher, we like to hear from people who talk a lot and about many things (politicians, influencers, basically every celebrity and public figure). We are reassured by their halo and their confidence (sometimes arrogance). Typically, we’d place as much trust in a political figure who tweets “I am not a doctor, but [insert strong opinion]” as in a measured expert. As a society, we seem to be fine with people who practice ultracrepidarianism…

In Thinking, Fast and Slow, D. Kahneman explains that because of mental shortcuts, people pay too much attention to the information itself and not enough to the reliability of its source. Today, information of different kinds (scientific findings, personal opinions, fake news…) circulates on the same channels, which contaminates the status of information. Add to that the source-misattribution bias, where people forget or misattribute the origin of information they recall, and it’s easy to understand why unverified information (fake news) is so prevalent.

👨‍👨‍👦 For lack of individual expertise, humans rely on each other’s capabilities and knowledge

We saw previously that we can mistake others’ knowledge for our own, but fortunately this is not a systematic error. In an experiment, couples were asked to remember random facts, such as “Kpro2 is a personal computer”; people tended to retain more facts when they believed their partner was not an expert in the matter. They accounted for each other’s expertise and weaknesses in an implicit “divide and conquer” strategy!

That’s what we call epistemic dependence. It is how we function as humans and one reason why we’re successful as a social species: we have dentists, cobblers, and farmers, each bringing their own expertise.

This phenomenon also holds true in the world of research (perhaps a bit too much); researchers build on each other’s studies, sometimes citing each other’s work without reading it in full.

🙇‍♀️ We should all acknowledge our epistemic dependence

  1. Understand and admit: “I almost certainly understand less than I think about any subject I can think of.” This can lead to more productive debates. In a study, participants from across the political spectrum were asked to explain the policies they supported and their implications. When they realized their understanding was shallower than they thought, they moderated their positions. “You can’t take a firm stand on shaky ground!”
  2. Trust the experts. Being wise is about finding a balance between your personal theories and the advice of others; without at least scratching the surface of an issue, you’d fall for anything.
  3. Test the veracity of facts. It’s good practice to check whether experts agree, and whether there is a general consensus, especially in science and research. A good cue for judging an expert’s trustworthiness is their humility: some are quick-witted and have their answers ready, but having an answer for everything isn’t a good sign!
  4. Ask more questions, even dumb ones!

This article’s main sources are:

  • This article from the MIT Technology Review about epistemic knowledge
  • This Brut video (in French) featuring Étienne Klein, in which he explains that we like to talk about things we don’t know about (ultracrepidarianism)
