No, there is no link between the coronavirus and 5G - nor is there any truth in the claim that Bill Gates is involved in a devilish plot to put a chip in everyone.
A tidal wave of conspiracy theories and misinformation is swamping social media, and on Tech Tent this week we look at the impact of what has been called an "infodemic" - and what can be done about it.
YouTube has long been seen as a place where people get drawn into conspiracy theories - watch one flat earth video and you end up gorging on dozens.
In fact, YouTube's engineers have made strenuous efforts recently to break this chain of addiction, tweaking its algorithms so they don't recommend ever more extreme videos.
But, according to researchers from the Oxford Internet Institute, that work may have been in vain, because the videos end up getting shared on other social media platforms.
"We found around 8,000 videos that YouTube had removed because they believe that they contained false information such as unscientific claims about coronavirus treatments," Aleksi Knuutila from the Institute tells the programme.
"We also found that these videos had gathered a fairly large audience - they have been shared around 20 million times on social media. We found evidence that it was specifically Facebook that was responsible for the distribution of these videos."
Facebook does have processes for checking misinformation, and labelling it as such, but Aleksi Knuutila says they don't appear to be working very well - just 1% of the YouTube videos had warning labels on them.
When we asked Facebook to respond to the Oxford study, the social media firm gave us this statement: "Facebook does not allow harmful misinformation on our platforms and we have removed seven million pieces of Covid-19 related misinformation between April and June.
"During the same period, we put warning labels on about 98 million pieces of COVID-19-related misinformation globally, which prevented people viewing the original content 95% of the time."
'Degrees of discomfort'
What is the evidence that conspiracy theories and misinformation are changing what people think?
Well, it seems that the huge volume of false claims about the threat to human health from 5G mobile networks is having an impact. Paul Lee from Deloitte tells us about a survey the consultancy has done about public attitudes to 5G across Europe.
This shows that the UK has less concern about 5G and health than many countries - just 14% said they were worried, compared with around a third of Austrians. But interestingly, the figure was 18% among 25-to-34-year-olds - people who are presumably both tech-savvy and likely to spend a lot of time on social media.
New technology always causes anxiety, but Paul Lee says this is on a different level: "With 5G, there are degrees of discomfort that I've never seen before. And I think partly what's happened is there is far more ability to share misinformation."
A huge test of their ability to tackle misinformation is looming for social media platforms - and in particular Facebook - in the form of the US election. And now we know that Facebook's long-awaited supreme court, its Oversight Board, will be up and running in October - just in time.
But the co-chair of the board, the former Danish Prime Minister Helle Thorning-Schmidt, tells us not to expect instant results.
She says quality, rather than speed, will guide the body's decision-making: "We are not here to have snap opinions about things. We are here to take principled decisions that Facebook has to follow."
And she stresses that if the rulings of this supreme court are ignored, its members won't stick around.
"If Facebook doesn't follow our decisions, it won't last very long because we have signed on to this task only because of the transparency and independence and the obligation from Facebook - including of course Mark Zuckerberg - to follow our decisions. So that is the red line for us," she says.
And even before the Oversight Board gets started, there are questions over its legitimacy.
A group which includes academics, civil rights campaigners and a former Facebook investor has launched a rival board to scrutinise the company's role in the election, warning that Facebook poses an urgent threat to democracy.