From claims that The Simpsons predicted Trump’s presidency back in 2000 to the absurd story that the Queen joked about assassinating Trump, our social media feeds are awash with lies and misunderstandings.

Just consider the 29 January shooting at a Quebec City mosque by Alexandre Bissonnette. Within just a few hours, conspiracy theories had begun to percolate, claiming a police cover-up to protect a Muslim accomplice.

As David Mikkelson, the co-founder of myth-busting site Snopes, puts it: “The bilge is rising faster than you can pump.” Tellingly, Snopes’s own traffic almost doubled – to 13.6 million monthly visitors – in October last year, as readers struggled to make sense of the events leading up to the election.

Fortunately, psychologists are beginning to understand why we accept dubious claims that support our own viewpoint while neglecting facts that contradict it. In this round-up of our previous content, we explore six strategies you can adopt to avoid being fooled.

Don’t be seduced by simplicity

A series of studies has shown that it is surprisingly easy to cloak a lie in a veil of credibility by making it sound so obvious it must be true. Often this revolves around “cognitive fluency” – how easy an idea is to process. Simply printing a story in an easy-to-read font can do the trick. For the same reason, we are also more likely to trust someone who feels familiar (because they have appeared on TV a lot, for instance) – even if they clearly lack expertise in what they are saying. Try questioning your sources and looking beyond the slick presentation.

Read more: Why are people so incredibly gullible?

Be wise to doctored images

Images can also increase a story’s cognitive fluency, but thanks to software like Photoshop they can now be easily doctored – and you may not realise just how easily a fake image can manipulate your memory of history. The site Slate once ran an experiment in which it showed readers pictures of certain political events – only some of which were real. When questioned afterwards, nearly half the readers claimed to remember the fake events actually occurring. It’s just one form of subtle suggestion that can lend credibility to a lie. So try to look for multiple sources of information, and don’t rely solely on the evidence immediately in front of your eyes.

Read more: How fake images change our memory and behaviour

Accept your ignorance

Many people suffer from over-confidence – the belief that they know more than the average person. And our smartphones – with seemingly infinite knowledge at our fingertips – can exacerbate this effect. As a result, we may be less critical of information that reinforces our assumptions, while dismissing anything that disagrees with us.


Read more: The man who studies the spread of ignorance

Look beyond your bubble

As Zaria Gorvett explains in her story on ‘group polarisation’, people naturally converge on the views of those around them – in both their physical and virtual neighbourhoods. So try talking to people with different views from your own, and look to news sources you wouldn’t normally read. You might be surprised to find information that questions the facts you took for granted.

Read more: The reasons why politics feels so tribal in 2016

Be curious

Along similar lines, psychologist Tom Stafford suggests that we could all benefit from being more curious. Whereas education alone does little to prevent polarised thinking, people who are more curious appear to appraise scientific evidence in a more balanced way – so that they are not blinded by their existing ideology.

Read more: How curiosity can protect the mind from bias

Consider the opposite

You may also benefit from the following strategy found in a vintage psychology paper. As Stafford describes in his piece, participants were asked to read articles about the death penalty, with the following instructions: “Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue.”

So, for example, if presented with data suggesting the death penalty lowered murder rates, the participants were asked to analyse the study's methodology and imagine the results pointed the opposite way. The technique turned out to reduce the participants’ confirmation bias – their tendency to discount evidence that did not agree with their existing beliefs – while leading them to be more critical of the evidence that supported their assumptions. As a result, they came to a more balanced opinion overall.

Read more: How to get people to overcome their bias

---

David Robson is BBC Future’s feature writer. He is @d_a_robson on Twitter.
