At worst, the ideas themselves are harmful. A recent report from one province in Iran found that more people had died from drinking industrial-strength alcohol – consumed in the false belief that it could protect them from Covid-19 – than from the virus itself. But even seemingly innocuous ideas could lull you and others into a false sense of security, discouraging you from adhering to government guidelines and eroding trust in health officials and organisations.
There’s evidence these ideas are sticking. One poll by YouGov and the Economist in March 2020 found that 13% of Americans believed the Covid-19 crisis was a hoax, for example, while a whopping 49% believed the epidemic might be man-made. And while you might hope that greater brainpower or education would help us to tell fact from fiction, it is easy to find examples of educated people falling for this false information.
Just consider the writer Kelly Brogan, a prominent Covid-19 conspiracy theorist; she has a degree from the Massachusetts Institute of Technology and studied psychiatry at Cornell University. Yet she has dismissed clear evidence of the virus’s danger in countries like China and Italy, and has even gone so far as to question the basic tenets of germ theory itself while endorsing pseudoscientific ideas.
Fortunately, psychologists are already studying this phenomenon. And what they find might suggest new ways to protect ourselves from lies and help stem the spread of this misinformation and foolish behaviour.
Information overload
Part of the problem arises from the nature of the messages themselves.
We are bombarded with information all day, every day, and we therefore often rely on our intuition to decide whether something is accurate. As BBC Future has described in the past, purveyors of fake news can make their message feel “truthy” through a few simple tricks, which discourages us from applying our critical thinking skills – such as checking the veracity of its source. As the authors of one paper put it: “
When thoughts flow smoothly, people nod along.”
Eryn Newman at Australian National University, for instance, has shown that
the simple presence of an image alongside a statement increases our trust in its accuracy – even if it is only tangentially related to the claim. A generic image of a virus accompanying some claim about a new treatment, say, may offer no proof of the statement itself, but it helps us visualise the general scenario. We take that “processing fluency” as a sign that the claim is true.
For similar reasons, misinformation often includes descriptive language and vivid personal stories. It also features just enough familiar facts or figures – such as the name of a recognised medical body – to make the lie within feel convincing, allowing it to tether itself to our previous knowledge.
Even the simple repetition of a statement – whether within the same text or across multiple messages – can increase its “truthiness” by increasing feelings of familiarity, which we mistake for factual accuracy. So the more often we see something in our news feed, the more likely we are to think that it’s true – even if we were originally sceptical.
Sharing before thinking
Gordon Pennycook, a leading researcher into the psychology of misinformation at the University of Regina, Canada, asked participants to consider a mixture of true and false headlines about the coronavirus outbreak. When they were specifically asked to judge the accuracy of the statements, the participants rated the fake news as true about 25% of the time. When they were simply asked whether they would share the headline, however, around 35% said they would pass on the fake news – 10 percentage points more.
“It suggests people were sharing material that they could have known was false, if they had thought about it more directly,” Pennycook says. (Like much of the cutting-edge research on Covid-19, this research has not yet been peer-reviewed, but a preprint has been uploaded to the PsyArXiv website.)
Perhaps their brains were engaged in wondering whether a statement would get likes and retweets rather than considering its accuracy. “Social media doesn’t incentivise truth,” Pennycook says. “What it incentivises is engagement.”
Or perhaps they thought they could shift responsibility on to others to judge: many people have been sharing false information with a sort of disclaimer at the top, saying something like “I don’t know if this is true, but…”. They may think that if there’s any truth to the information, it could be helpful to friends and followers, and if it isn’t true, it’s harmless – so the impetus is to share it, not realising that sharing causes harm too.
Whether it’s promises of a homemade remedy or claims about some kind of dark government cover-up, the promise of eliciting a strong response in their followers distracts people from the obvious question: is it true?
Override reactions
Classic psychological research shows that some people are naturally better at overriding their reflexive responses than others. This finding may help us understand why some people are more susceptible to fake news than others.
Researchers like Pennycook use a tool called the “cognitive reflection test” or CRT to measure this tendency. To understand how it works, consider the following question:
- Emily’s father has three daughters. The first two are named April and May. What is the third daughter’s name?
Did you answer June? That’s the intuitive answer that many people give – but the correct answer is, of course, Emily.
To come to that solution, you need to pause and override that initial gut response. For this reason, CRT questions are not so much a test of raw intelligence as a test of someone’s tendency to employ their intelligence by thinking things through in a deliberative, analytical fashion, rather than going with their initial intuitions. The people who don’t do this are often called “cognitive misers” by psychologists, since they may be in possession of substantial mental reserves, but they don’t “spend” them.
When it came to the coronavirus statements, for instance, Pennycook found that people who scored badly on the CRT were less discerning in the statements that they believed and were willing to share.
Testing participants soon after the original YouGov/Economist poll was conducted, Matthew Stanley at Duke University found a similar pattern: people who scored worse on the CRT were significantly more susceptible to these flawed arguments. These cognitive misers were also less likely to report having changed their behaviour to stop the disease from spreading – such as handwashing and social distancing.
Stop the spread
Knowing that many people – even the intelligent and educated – have these “miserly” tendencies to accept claims at face value might help us to stop the spread of misinformation.
Given the work on truthiness – the idea that we “nod along when thoughts flow smoothly” – organisations attempting to debunk a myth should avoid making their rebuttals overly complex.
Instead, they should present the facts as simply as possible – preferably with aids like images and graphs that make the ideas easier to visualise. As Stanley puts it: “We need more communications and strategy work to target those folks who are not as willing to be reflective and deliberative.” It’s simply not good enough to present a sound argument and hope that it sticks.
If they can, these campaigns should avoid repeating the myths themselves. The repetition makes the idea feel more familiar, which could increase perceptions of truthiness. That’s not always possible, of course. But campaigns can at least try to make the true facts more prominent and more memorable than the myths, so they are more likely to stick in people’s minds. (It is for this reason that I’ve given as little information as possible about the hoax theories in this article.)
When it comes to our own online behaviour, we might try to disengage from the emotion of the content and think a bit more about its factual basis before passing it on. Is it based on hearsay or hard scientific evidence? Can you trace it back to the original source? How does it compare to the existing data? And is the author relying on common logical fallacies to make their case?
Pennycook suggests that social media networks could nudge their users to be more discerning with relatively straightforward interventions. In his experiments, he found that asking participants to rate the factual accuracy of a single claim primed them to think more critically about other statements, so that they were more than twice as discerning about the information they shared.
In practice, it might be as simple as a social media platform providing the occasional automated reminder to think twice before sharing, though careful testing could help the companies to find the most reliable strategy, he says.
There is no panacea. As with our attempts to contain the virus itself, fighting the dissemination of dangerous and potentially life-threatening misinformation will require a multi-pronged approach.
And as the crisis deepens, it will be everyone’s responsibility to stem that spread.
--
David Robson is the author of The Intelligence Trap, which examines why smart people act foolishly and the ways we can all make wiser decisions. He is @d_a_robson on Twitter.