Could this be the cure for fake news?

Most organisations fight fake news only after they see it. But what if there were a way to stop misinformation before it even starts to spread?

For decades, medicine has provided us with an easy way to prevent diseases: vaccines.

Most of us are familiar with how a vaccine works – it exposes our bodies to a weakened version of a virus to help us build antibodies against the real thing. Now common practice in GP surgeries around the world, vaccination has all but extinguished some of the worst diseases of the last century, including measles and polio.

But could vaccines have applications beyond medicine?

Researchers like Sander van der Linden are working on a type of vaccination that could combat a very 21st-Century scourge: fake news.

This could work because misinformation behaves like a virus. False news stories spread faster, deeper and farther than true stories, cascading from ‘host’ to ‘host’ across Twitter hashtags, WhatsApp groups and your great-uncle Gavin’s Facebook profile (yes, he opened one). To make things worse, a fake story is tenaciously persistent.

Misinformation behaves like a virus, spreading across social media faster and farther than factual stories (Credit: Getty Images)

“If you try to debunk it, misinformation sticks with people,” says van der Linden, who leads the Social Decision-Making Laboratory at the University of Cambridge. “Once it’s integrated into the long-term memory, it becomes very difficult to correct it.”

So what can you do? Like Han Solo, you shoot first.

Professionals in this field call the approach “pre-bunking”. Instead of waiting for false information to spread and then laboriously fact-checking and debunking it, researchers go for a pre-emptive strike that has the potential to shield your brain.

Decades of research suggest that this approach works.

Psychologists first proposed inoculation in the 1960s, driven by fears of propaganda and brainwashing during the Cold War. But the 21st-Century version targets our modern misinformation landscape, one more preoccupied with political divisions and culture wars.

Hot topic

Take climate change. More than 97% of climate scientists have concluded that humans are responsible for global warming, but large groups of society still have trouble believing it. When asked what percentage of climate scientists agree that human-caused global warming is occurring, only 49% of Americans thought it was more than half – and only 15% answered, correctly, that it was more than 91%. The confusion reflects sophisticated campaigns aimed at sowing doubt among the public.

Once doubt settles, it is hard to dislodge it

The difficulty is that once doubt settles, it is hard to dislodge it. Van der Linden and his colleagues wondered what would happen if they reached people before the nay-sayers did.  

They dug up a real-life disinformation campaign: the so-called Oregon Petition, which in 2007 falsely claimed that over 31,000 American scientists rejected the position that humans caused climate change.

Debunking misinformation after the fact is surprisingly ineffective – but what about ‘pre-bunking’? (Credit: Getty Images)

The team prepared three documents. First, they wrote a ‘truth brief’ explaining that 97% of climate scientists agree that humans are responsible for climate change. They also prepared a ‘counter-brief’ revealing the flaws in the Oregon Petition – for instance, that among the petition’s 31,000 names are people like the deceased Charles Darwin and the Spice Girls, and that fewer than 1% of the signatories are climate scientists. The third document was the petition itself.

Finally, they surveyed 2,000 people. First, they asked how large the scientific consensus on climate change is – before showing them any document. Then they split them into groups: one that saw the ‘truth brief’, one that saw only the Oregon Petition, and others that saw the ‘truth brief’ before the petition.

The results were intriguing. When participants were first asked about the scientific consensus on climate change, they estimated it to be around 72% on average. But they then changed their estimates based on what they read.

When the scientists provided a group with the ‘truth brief’, the average rose to 90%. For those who only read the Oregon Petition, the average sank to 63%.

When a third group read them both – first the ‘truth brief’ and then the petition – the average remained unchanged from participants’ original instincts: 72%.

“I didn’t expect this potency of misinformation,” says van der Linden – the misinformation managed to completely ‘neutralise’ the correct data.

One study found that when people were given a factual brief, followed by a fake one, the misinformation completely neutralised the correct data (Credit: Getty Images)

Enter inoculation.

When a group of participants read the ‘truth brief’ and were also warned that politically motivated groups could try to mislead the public on topics like climate change – the ‘vaccine’ – their average estimate rose to almost 80%. Strikingly, this held even after they read the Oregon Petition.

The ‘counter-brief’ detailing how the petition was misleading was more effective still. A final group that read it before the petition estimated that 84% of scientists agreed that humans were responsible for climate change (the actual number, of course, is still 97%).

In a separate piece of research, another team, led by John Cook, asked a similar question and arrived at the same result: inoculation could give us the upper hand against misinformation.

They are flipping the approach, doing a pre-emptive strike and giving people a heads-up – Eryn Newman

“It’s an exciting development,” says Eryn Newman, a cognitive scientist and lecturer at the Australian National University who was not involved in the studies. “They are flipping the approach, doing a pre-emptive strike and giving people a heads-up.”

In other words, great-uncle Gavin may think twice before sharing that latest post about Brexit, Trump or whether the Earth is flat.

Why it works

Humans usually rely on mental shortcuts to think; the world is full of information and our brain has limited time and capacity to process it. If you see a wrinkled, grey-haired man and someone tells you he is a senior citizen, your brain accepts that and carries on.

People who craft misinformation know this and use it to their advantage. For instance, the drafters of the Oregon Petition invoked 31,000 scientists because we tend to trust experts.

When information feels easy to process, people tend to nod along – Newman

“When information feels easy to process, people tend to nod along,” says Newman, who co-authored a review on how to deal with false information.

Before believing a piece of new information, most people scrutinise it in at least five ways, the review found. We usually want to know whether other people believe it, whether there is evidence supporting the claim, whether it fits with our previous knowledge of the matter (hence the grey-haired man, who might fit your idea of a senior citizen), whether its internal logic makes sense and whether the source is credible enough.

But at times we rely too much on shortcuts to answer these five questions. Our evaluation is not as thorough. We do not ask ourselves, “Hmm, how many of those are actually climate scientists?” Instead, we just accept the number “31,000 scientists” because it feels about right.

Before believing a piece of new information, people scrutinise it – but sometimes rely on shortcuts to do so (Credit: Getty Images)

Psychologists call this more automatic way of thinking “System 1”. It is immensely helpful in daily life, but vulnerable to deceit. In our fast-paced information ecosystem, our brain jumps from one Facebook post to the next, relying on rules of thumb to assess headlines and comments without giving much thought to each claim.

This is fertile ground for fake news. However, the teams working on the misinformation ‘vaccine’ believe their work allows for deeper thinking to kick in.

“Inoculation forces our brain to slow down,” says van der Linden. “There is a warning element.”

To appreciate this, it might be helpful to understand how a vaccine (the actual medical procedure, not the misinformation metaphor) works.

Each time we receive a shot, we are showing our body a sample of a disease – a biological ‘mugshot’ weak enough not to make us really ill, but strong enough to provoke a reaction. This interloper startles our immune system into action and it starts building defences, or antibodies. When we come across the real disease, our body recognises it from the mugshot and is ready to strike back.

When the research team tipped off the participants that others might try to deceive them, they did not take the misinformation at face value

Something similar happened in van der Linden’s study. When his team tipped off participants that others might try to deceive them, they did not take the Oregon Petition at face value. They were nudged into overruling their System 1 thinking and replacing it with its cousin – the slower but more powerful thinking mode psychologists call System 2.

Those who read both the ‘truth brief’ and the Oregon Petition, and estimated the scientific consensus at 72%, perhaps relied more on the faster and more superficial System 1. But as the ‘vaccine’ startled their brains into switching to System 2, the last two groups remembered the ‘knowledge mugshot’ from the counter-brief and distrusted the petition. That could explain their higher estimates.

Game on

There is one great weakness to this approach: inoculating people case by case takes a lot of time and effort.

While the inoculation approach is effective, the difficulty is that it takes time and effort to ‘vaccinate’ everyone (Credit: Getty Images)

Let us stretch the vaccine metaphor a bit more. Having a shot against rubella, for instance, will not keep you from getting measles or hepatitis, as it will only create antibodies against the rubella virus. Similarly, if you receive the counterarguments to climate denial, you might still be vulnerable to fake news on other topics.

You can’t ‘pre-bunk’ every story because you don’t know where the next deception is coming from – Jon Roozenbeek

“There are millions of topics out there on which you can deceive people,” explains Jon Roozenbeek, who joined van der Linden’s team in 2016. “You can’t ‘pre-bunk’ every story because you don’t know where the next deception is coming from.”

The other problem is that people don’t like being told what is true and what is false; we usually think we know better. That is why pedagogy experts advise educators to give students an active role in learning.

So the Cambridge researchers went back to the lab until they came up with a new idea.

“What if we taught people the tactics used in the fake news industry?” van der Linden recalls of his thinking at the time. “What better way to prepare them?”

The result was a role-playing game in which participants could play one of four characters, from an ‘alarmist’ to a ‘clickbait tycoon’. The game focussed on the strategies of fake news, rather than its topics.

Once the offline fake news game proved effective in a test with Dutch high school students, they scaled it up into an online version called Bad News with the help of the collective DROG.

Navigating the online game takes less than 15 minutes – but it is a surreal experience. You launch a fake news site, become its editor-in-chief, purchase an army of Twitter bots and direct your followers against a well-meaning fact checker. By the time I surpassed 7,000 followers, I felt slightly uneasy at how addictive it was.

Throughout the game, you learn six techniques used by fake news tycoons: impersonation, emotional exploitation, polarisation, conspiracy, discredit and trolling. The idea is that the next time someone tries to use these tactics against me on social media, I should recognise them and be able to call them out. Or, at least, an alarm will go off somewhere in my brain and the automatic, easy System 1 process will take a back seat as my mind subs in System 2 for closer scrutiny. One can only hope.

A fake news game teaches players six techniques used by misinformation-mongers: impersonation, emotional exploitation, polarisation, conspiracy, discredit and trolling (Credit: Getty Images)

It seems a bit counterintuitive to fight fake news by teaching people how to become a misinformation mogul, but Roozenbeek trusts the experiment. He is also amused, if only slightly, at my discomfort.

“If you get a shot, you might feel a bit nauseous later that day,” the PhD student assured me, “but it helps you in the long term.”

The pair drafted an academic paper with the results from 20,000 players who agreed to share their data for a study. The paper is still unpublished, but they say the results are encouraging.

A shorter version of the game is on display at an exhibition at the London Design Museum in which people can play the part of an information agitator in post-Brexit Britain.

In their fondness for the inoculation metaphor, the Cambridge team members speak hopefully of the online game, which is now being translated into more than 12 languages. Van der Linden expects people could achieve “herd immunisation” if it is shared widely enough online. Roozenbeek talks about “general immunity”, since the game doesn’t target one specific topic but the general techniques of fake news.

Ultimately, it will also have to pass the test of time: researchers do not know how long the benefits of the inoculation last – if the inoculation works at all. Like a virus, disinformation moves in a rapidly changing environment and adapts quickly to new conditions.

“If the virus changes, will people still be protected?” asks Newman, who wonders whether the game will withstand the ever-changing nature of online trolling and disinformation.

In other words: will this boost to your great-uncle Gavin’s mental defences last until the next election cycle?

Diego Arguedas Ortiz is a science and climate change reporter for BBC Future. He is @arguedasortiz on Twitter.
