Coronavirus testing: What is a false positive?

By Simon Maybin and Josephine Casserly
More or Less, BBC Radio 4


There has been a lot of talk on social media about "false positive" test results after several commentators suggested they might be seriously skewing the coronavirus figures - but that is based on a misunderstanding of the impact of false positives.

Talk Radio host Julia Hartley-Brewer has claimed that "nine out of 10 of the positive cases of Covid we are finding in the community when we do random testing, when anyone just puts themselves forward, will be wrong. They will not be people who have got coronavirus."

Could it be true that 90% of positive results from tests in the community - that means tests not carried out in hospitals - are false? The answer is "no" - there is no way that so-called false positives have had such an impact on the figures.

Hartley-Brewer referred to both "random testing" and "anyone [who] just puts themselves forward", which are different things, and the difference between them is important.

Also, there are many other signs that the rising number of positive tests is truly reflecting the virus spreading, for example a subsequent rise in Covid hospitalisations.

What is a false positive?

A false positive is when someone who does not have coronavirus tests positive for it.

No test is 100% accurate - there will always be some people who test positive when they do not have the disease, or test negative when they do have it.

False positives in any testing programme are important - especially when there is low prevalence of a disease - because they could potentially make us think there are significantly more cases of something than there really are.

The false positive rate usually refers to the number of people who are not infected but get positive results, as a proportion of all the people tested who really don't have the virus. We do not know what the precise rate is though.
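The definition above can be written as a small calculation. This is an illustrative sketch, not an official formula from the article; the function name and the example numbers are our own.

```python
def false_positive_rate(false_positives, true_negatives):
    """Proportion of the truly uninfected people tested who
    nonetheless get a positive result."""
    return false_positives / (false_positives + true_negatives)

# Illustrative numbers: if 1,000 uninfected people were tested and
# 8 of them wrongly tested positive, the rate would be 0.8%.
rate = false_positive_rate(8, 992)
print(f"{rate:.1%}")  # 0.8%
```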

Dr Paul Birrell, a statistician at the Medical Research Council's Biostatistics Unit at the University of Cambridge, says: "The false positive rate is not well understood and could potentially vary according to where and why the test is being taken. A figure of 0.5% for the false positive rate is often assumed."

Randomness is the key

The most important thing to know about the impact of false positives is that it varies hugely depending on who is being tested.

What Hartley-Brewer said confused the idea of random testing with community testing for Covid. Those are two different situations, and false positives have a very different impact in each case.

If you tested 1,000 people at random for Covid-19 in early September, for example, data from the Office for National Statistics (ONS) infection study suggests you should have expected one of them to actually have the virus.

With a false positive rate of 0.8% - a figure used by Ms Hartley-Brewer and within the broad range of what we think might be the actual rate for community testing - you would get eight false positives. So in that context, it's true that roughly 90% of positives would be false.
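The random-testing arithmetic above can be sketched as a quick calculation, using the figures assumed in the piece: roughly 1 in 1,000 infected (from the ONS study, early September) and a 0.8% false positive rate.

```python
# Assumed figures from the piece, applied to 1,000 randomly tested people.
tested = 1000
prevalence = 0.001            # ~1 in 1,000 actually infected (ONS)
false_positive_rate = 0.008   # 0.8%, the rate Hartley-Brewer used

true_positives = tested * prevalence                               # 1
false_positives = (tested - true_positives) * false_positive_rate  # ~8

share_false = false_positives / (true_positives + false_positives)
print(f"{share_false:.0%} of positives would be false")  # 89% - roughly nine in ten
```

This is why the "nine out of 10" figure is correct for genuinely random testing of the whole population, but only for that scenario.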

But - crucially - the people going for community testing for Covid-19 (at places such as drive-through centres) are not a random sample of the public. They are people who have symptoms, are in care homes or are in hot-spot areas.



Figures for late September from Public Health England show that 7% of community tests were positive. That means of every 1,000 people tested, 70 were positive. Even with a false positive rate of 0.8%, seven of those would be false positives, but 63 would be true positives - the vast majority.
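The community-testing arithmetic above can be sketched the same way, using the figures in the piece: 7% positivity (Public Health England, late September) and the assumed 0.8% false positive rate.

```python
# Figures from the piece, applied to 1,000 people tested in the community.
tested = 1000
positives = 70                # 7% of tests come back positive (PHE)
false_positive_rate = 0.008   # assumed 0.8%

# The false positive rate applies only to people who do NOT have the
# virus - at most 930 of the 1,000 - so it yields roughly 7 false positives.
false_positives = round((tested - positives) * false_positive_rate)  # 7
true_positives = positives - false_positives                         # 63

print(true_positives, false_positives)  # 63 true positives, 7 false
```

The same 0.8% rate that made 90% of positives false under random testing leaves the vast majority of positives genuine once 7% of those tested really have the virus.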

So the daily case count is not being skewed significantly by false positives. There will also be some false negatives, meaning that some people who actually have Covid are not being counted.

When we put it to Hartley-Brewer that she had misinterpreted explanations of the impact of false positives, she pointed us to other articles that also discussed the impact on random samples of the population, rather than on people who are much more likely to have the virus.

There are other signs of real positive results

False positives do not put people in hospital, so if more people are in hospital with Covid, you can be pretty sure that is due to genuine cases. The same is true of the number of deaths.

Dr Birrell says that to be certain cases really are increasing, the daily case count "should always be considered alongside other information sources, such as the hospitalisations or deaths, or the community surveys run by the ONS or REACT".

Correction 7 October: The piece was corrected to show that 63 out of 1,000 people tested would be true positives if there was a 7% positivity rate and a 0.8% false positive rate.

More or Less is on BBC Radio 4 at 09:30 BST on Wednesdays and repeated at 16:30 on Fridays and 20:00 on Sundays. It's also available on BBC Sounds.