How wrong was the election polling?
Once again the polls, taken as a whole, were not a good guide to the election result.
Over the course of the campaign the gap between the two main parties narrowed but, with one exception, the final polls all suggested a clearer Conservative lead than the actual outcome.
Having said that, it wasn't an unmitigated disaster. Every poll throughout the campaign put the Conservatives ahead - and that was indeed the result.
The final polls were fairly accurate on the Conservative and Lib Dem shares. It was Labour's share that they uniformly underestimated. They also overestimated UKIP and the SNP.
| |Con %|Lab %|LD %|UKIP %|Green %|SNP %|Con lead %|
|---|---|---|---|---|---|---|---|
|Actual result (GB)|43.5|41.0|7.6|1.9|1.7|3.1|2.5|
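The "Con lead" column and the notion of a poll being "closest to the actual result" come down to simple arithmetic, which can be sketched as below. The actual shares are from the table above; the `example_poll` figures are purely hypothetical, for illustration only, and are not any pollster's published numbers.

```python
# Actual GB vote shares (%) from the table above.
actual = {"Con": 43.5, "Lab": 41.0, "LD": 7.6, "UKIP": 1.9, "Green": 1.7, "SNP": 3.1}

# Hypothetical final-poll figures, invented here for illustration only.
example_poll = {"Con": 44.0, "Lab": 36.0, "LD": 8.0, "UKIP": 4.0, "Green": 2.0, "SNP": 4.0}

def con_lead(shares):
    """Conservative lead = Con share minus Lab share, in percentage points."""
    return shares["Con"] - shares["Lab"]

def mean_abs_error(poll, result):
    """Average absolute error across party shares: one way to rank pollsters."""
    return sum(abs(poll[p] - result[p]) for p in result) / len(result)

print(f"Actual Con lead: {con_lead(actual):.1f} points")        # 2.5
print(f"Poll Con lead:   {con_lead(example_poll):.1f} points")  # 8.0
print(f"Mean absolute error: {mean_abs_error(example_poll, actual):.1f} points")
```

On these hypothetical numbers, the poll's mean absolute error is dominated by the Labour underestimate, which mirrors the pattern described in the article.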
Survation were closest to the actual result. Kantar Public's numbers were also reasonably good.
YouGov's final poll, like most of the others, seriously underestimated Labour. Prior to that, they had been suggesting a closer race. They also had a separate seat projection model which had been indicating a hung parliament. That had been met with a lot of scepticism but, with hindsight, was pretty accurate.
In the final weeks of the campaign, the polls were often criticised for being "all over the place". It's true that they were pointing to very different outcomes.
That variation clearly made it difficult to interpret what they were saying. It's surely better, though, that they had different numbers than that they were all wrong in exactly the same way. If you're looking for consistent accuracy then opinion polls are probably not for you.
Pollsters are also sometimes accused of herding - deliberately manipulating their figures so they all say the same thing. That accusation can't be levelled at this election.
In 2015 the polls went wrong because their samples were not representative of the electorate - they contained too many Labour voters. They also misjudged the difference in turnout rates between age groups, overestimating turnout among young voters.
The pollsters who were furthest from the actual result this time were those, like ICM and ComRes, who had taken the strongest measures to try to rectify the problem from 2015. Survation made no significant changes to their methodology and came out on top.
It looks as though the errors this time were caused, at least in part, by fighting the last war.
We'll never know the exact figures for turnout among young voters - it's a secret ballot, remember - but YouGov's post-election estimate puts it at 57% for 18-19 year olds and 59% for 20-24 year olds.
That's lower than for older voters but considerably higher than estimates for young voters in 2015.
We can also see that the places where the number of voters increased the most were generally those with young populations. The assumption made by some pollsters that young turnout would continue to under-perform was probably wrong.