By Melanie Tannenbaum
Fifteen years ago, psychologists showed that the most incompetent people are the worst at recognising their own incompetence, confirming what most of us already suspected. Now it turns out that even highly competent people may lay claim to more knowledge than they actually possess.
In a set of studies published recently in Psychological Science, Stav Atir, Emily Rosenzweig and David Dunning showed that people who view themselves as experts in any given domain may actually be more susceptible to over-claiming how much they know about it – more so than non-experts, who are more willing to acknowledge that they know less.
In one experiment, people were first asked to rate their financial literacy, and then were asked whether they recognised various terms, including “fixed-rate deduction,” “annualised credit,” and “pre-rated stocks”. The catch: the researchers invented those terms just for the study. The higher participants rated their financial literacy, the more likely they were to claim to recognise the terms.
In another experiment, participants took one of two versions of a quiz on North American geography. Those who took the easier version were goaded into feeling they must know a lot about the subject, while those who took the more difficult version were led to conclude they must know very little. Then both groups were shown a series of place names within North America. Some were real (Philadelphia, Acadia National Park), while others were fictitious (Cashmere, Oregon; Lake Othello, Wisconsin). The group that took the easy version of the quiz were far more likely to claim to be familiar with the fake place names.
When people think they know more than others, they are generally correct. Objective measures show a strong correlation between the degree to which people think they know a lot about a given subject, and the amount they actually know. But self-rated experts are still more likely to claim knowledge of fictitious terms than their more modest counterparts.
You may recognise the name of David Dunning among the authors above. Back in 1999, he and Justin Kruger conducted a pioneering series of studies showing that the least-capable people are the ones who dramatically over-estimate their own abilities, while the most highly skilled people tend to underestimate their performance. It’s now known as the Dunning-Kruger effect.
On the surface, this latest study seems to contradict the 1999 study. If highly skilled people underestimate their performance, how can they also over-claim their level of knowledge?
“I think the difference stems from how we measure overconfidence, and when,” lead author Stav Atir explained to Gizmodo. “In much of the previous work on overconfidence, participants are asked to evaluate or judge their performance after the act. [In this newer research] we are asking them to predict their performance, in a sense. Possibly, experts know when they’ve made a mistake, but they aren’t very good at predicting it.”
Essentially, asking people to predict their performance and asking them to gauge how well they’ve done once a task is completed are two different questions. Each taps into a different skill-set, and each presents different obstacles to overcome. For instance, Atir and her co-authors suggest that people with higher levels of knowledge might over-claim familiarity with certain terms because those terms sound vaguely familiar and thus plausible.
Case in point: in the latest study, the researchers made up three biological terms: “meta-toxins,” “bio-sexual,” and “retroplex”. The more I know about biology, the more likely I’d be to project my existing knowledge onto them. I know that a toxin is any poisonous substance produced by a living cell or organism. “Meta-” is also a familiar prefix in science. So even though I’ve never heard the term “meta-toxin” before, I might fall right into the trap of feeling knowledgeable about it. I could even use my existing knowledge to come up with a reasonable definition for it, even though it doesn’t exist.
According to Atir and her colleagues, this general “feeling of knowing” is responsible for the pattern of experts over-claiming. It’s what’s known as a heuristic, a decision-making rule of thumb that helps us navigate the world quickly and effectively. In the present case, if you feel like you know a lot about something, you will use that feeling as a general guide to make inferences about what you should know or probably know. The more expertise you have, the more existing knowledge you have to draw on, and the more likely you will be to (inaccurately) ping fictitious terms as something you’ve heard before.
The original Dunning-Kruger effect, in contrast, didn’t rely on people being led astray by a subjective experience of things feeling “familiar”. It showed up when participants were asked to rate their performance after a task had been completed. It relies on the fact that the lack of knowledge exhibited by those scoring in the bottom quartile would render them ill-equipped to accurately evaluate their own performance.
In fact, in those original studies, giving the poor performers a training packet to help them learn how to grade themselves made the effect disappear. As Dunning and Kruger said themselves, their landmark effect emerged because “not only do [poor performers] reach mistaken conclusions and make regrettable errors, their incompetence robs them of the ability to realise it”.
In the end, whether we’re at the bottom of the knowledge totem pole or a true expert, we’re all susceptible to overestimating our abilities in different circumstances.
Atir, Stav; Rosenzweig, Emily; and Dunning, David. (2015) “When knowledge knows no bounds: self-perceived expertise predicts claims of impossible knowledge,” Psychological Science 26(8): 1295-1303. [Published online July 14, 2015]
Kruger, Justin, and Dunning, David. (1999) “Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments,” Journal of Personality and Social Psychology 77(6): 1121-1134.
Image: The Princess Bride (1987), via PandaWhale.