Drew Linzer: The stats man who predicted Obama's win

Pundits insisted the presidential race was a toss-up, but "polling aggregators" - who analyse polls to make predictions - were being criticised for favouring President Obama. Not any more.

In September we called Drew Linzer, an assistant professor of political science at Emory University, to ask for his predictions for the upcoming US presidential election.

Linzer runs the website Votamatic, which uses current polls and historical trends to predict the outcome of major elections. He gave the same prediction he had been posting on his site since 23 June.

Obama 332 votes, Romney 206.

Weeks later, after the first presidential debate, when Obama's lacklustre performance kicked off a surge of momentum for the Republican challenger Mitt Romney, the president's odds had sunk like a stone in national polls, and states once considered toss-ups were being marked down as favourites for Romney.

Asked again for his updated prediction, Linzer gave the same answer.

No change, he said: Obama 332 votes, Romney 206.

Now, Obama has been elected to a second term, and election workers are still counting the votes in Florida, which is leaning ever so slightly towards the Democrats. The Romney team has admitted to the Miami Herald that it lost the state, though the result has not been officially called. When it is, the final tally in this once too-close-to-call election will be:

Obama 332 votes, Romney 206.

Aside from Barack Obama himself, it is people such as Linzer - along with his contemporaries Nate Silver, who writes the FiveThirtyEight blog at the New York Times, and Sam Wang, co-founder of the Princeton Election Consortium - who may be this November's big winners.

In a race that many old-school pundits said was too close to call, Linzer, Silver and Wang, who all run websites that use some version of poll aggregation and statistical analysis to predict elections, had Obama as a clear favourite with a slim but persistent lead.

"We really shouldn't be all that surprised that our methods 'worked' on election day," says Linzer.

"All this proves is that public opinion research is still a reliable and accurate way to learn about people's voting preferences… as we've known all along. [There's] no need to go on gut instincts or intuition or whatever else the pundits are doing, when we have actual real information," says Linzer.

But those who make their living from gut instinct and intuition were surprised. For weeks, journalists and pollsters were convinced that the work of Linzer, Silver and Wang was politically biased or that their maths was wrong.

These men and their statistical models have now been proven correct - and that means re-evaluating what we think we know about politics, polling, and how to win the presidency.

'Ideologues'?

These aggregators are based on a simple premise.

"Pollsters individually make mistakes, no matter how well-constructed their polls are, but in the aggregate they are quite sound," says Sam Wang, who in his day job is an associate professor of molecular biology and neuroscience at Princeton University.

And in the past two election cycles, the growing number of state polls being conducted - along with advances in computing technology - has allowed those polls to be aggregated, weighted and indexed to produce a clear probability of how people will vote.
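To make that concrete, here is a minimal sketch, in Python, of how several noisy surveys of the same state might be pooled, with each poll weighted by its sample size. The polls, figures and function names are invented for illustration and are not taken from any of the sites' actual models.

    import math

    # Hypothetical polls of one state: (Obama %, Romney %, sample size).
    # All figures are invented purely for illustration.
    polls = [(49.0, 46.0, 800),
             (47.0, 48.0, 600),
             (50.0, 45.0, 1000)]

    def pool_polls(polls):
        """Combine polls, weighting each by its sample size."""
        total_n = sum(n for _, _, n in polls)
        margin = sum((dem - rep) * n for dem, rep, n in polls) / total_n
        # Rough standard error of the pooled margin, in percentage points,
        # treating each response as an approximately 50/50 Bernoulli draw.
        se = 2 * 100 * math.sqrt(0.5 * 0.5 / total_n)
        return margin, se

    margin, se = pool_polls(polls)
    print(f"pooled margin: {margin:+.1f} points (standard error ~{se:.1f})")

The pooled margin carries a smaller sampling error than any single survey, which is why an aggregate can look steady even while individual polls bounce around.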

How they work

  • FiveThirtyEight (Nate Silver): Takes into account state and national polling as well as economic indicators, then uses an in-house model to try to eliminate bias and errors
  • Princeton Election Consortium (Sam Wang): Calculates win probability from the median of recent polls, then uses those probabilities to calculate the distribution of electoral votes (a rough sketch of the idea follows below)
  • Votamatic (Drew Linzer): Combines state polls with historical data to predict probable outcomes

Read a more in-depth comparison at the Princeton Election Consortium
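As a rough illustration of the median-based step described in the Princeton Election Consortium entry above, the snippet below takes the median margin from a handful of recent state polls and converts it into a win probability using a normal approximation. The example polls and the assumed spread (sigma) are invented for illustration; they are not the Consortium's published figures or code.

    import statistics
    from math import erf, sqrt

    def win_probability(recent_margins, sigma=3.0):
        """Convert the median of recent poll margins (candidate's lead,
        in percentage points) into a rough win probability, assuming the
        true margin sits within roughly +/- sigma points of that median.
        sigma is an assumed value for illustration only."""
        median_margin = statistics.median(recent_margins)
        z = median_margin / sigma
        return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF

    # Invented example: five recent polls of one state, lead in points.
    print(round(win_probability([2.0, 3.0, -1.0, 4.0, 2.5]), 2))  # ~0.8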

Each of the aggregating websites uses a slightly different formula to arrive at its results, whether that means looking at historical trends or bringing in economic data and other outside factors to temper what the polls alone say. But in 2012, all of the websites ran thousands of simulations predicting a probable win for Barack Obama.
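One way to picture what running "thousands of simulations" means in practice is a simple Monte Carlo sketch: give each state a win probability and an electoral-vote count, simulate many pretend elections, and count how often a candidate clears 270 electoral votes. The probabilities, the three "swing states" and the safe-vote total below are hypothetical, chosen only to show the mechanics rather than to reproduce any site's forecast.

    import random

    # Hypothetical swing states: name -> (Obama win probability, electoral votes).
    swing = {"Ohio": (0.75, 18), "Florida": (0.55, 29), "Virginia": (0.65, 13)}
    obama_safe_votes = 247  # assumed electoral votes from "safe" states

    def simulate(n_sims=10000):
        """Simulate n_sims elections and return how often Obama reaches 270."""
        wins = 0
        for _ in range(n_sims):
            electoral_votes = obama_safe_votes
            for prob, votes in swing.values():
                if random.random() < prob:
                    electoral_votes += votes
            if electoral_votes >= 270:
                wins += 1
        return wins / n_sims

    print(f"Simulated Obama win probability: {simulate():.2f}")

Repeating the draw many times turns a set of per-state probabilities into a full distribution of electoral-vote outcomes, which is the broad idea behind the headline probabilities such sites publish.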

"The polling this year has been remarkably stable," says Linzer, and even though it dipped after Obama's disastrous debate performance, it wasn't enough to radically shake the aggregate predictions - even though individual polls might be fluctuating.

That led to a steady stream of criticism, with Silver - the most widely read - taking the brunt of the abuse from more traditional election-watchers.

Joe Scarborough, a former Congressman and the host of MSNBC's Morning Joe programme, said: "Anybody that thinks that this race is anything but a toss-up right now is such an ideologue they should be kept away from typewriters, computers, laptops and microphones for the next 10 days, because they're jokes."

Critics said the formulae each aggregator used had built-in bias. But for Linzer and his colleagues, their sites aren't about political machination, but impartial maths.

Electoral college results

Barack Obama: 332 electoral votes
  • California 55
  • Colorado 9
  • Connecticut 7
  • District of Columbia 3
  • Delaware 3
  • Florida 29
  • Hawaii 4
  • Iowa 6
  • Illinois 20
  • Massachusetts 11
  • Maryland 10
  • Maine 4
  • Michigan 16
  • Minnesota 10
  • Nebraska 0
  • New Hampshire 4
  • New Jersey 14
  • New Mexico 5
  • Nevada 6
  • New York 29
  • Ohio 18
  • Oregon 7
  • Pennsylvania 20
  • Rhode Island 4
  • Virginia 13
  • Vermont 3
  • Washington 12
  • Wisconsin 10

Mitt Romney: 206 electoral votes
  • Alaska 3
  • Alabama 9
  • Arkansas 6
  • Arizona 11
  • Georgia 16
  • Idaho 4
  • Indiana 11
  • Kansas 6
  • Kentucky 8
  • Louisiana 8
  • Maine 0
  • Missouri 10
  • Mississippi 6
  • Montana 3
  • North Carolina 15
  • North Dakota 3
  • Nebraska 5
  • Oklahoma 7
  • South Carolina 9
  • South Dakota 3
  • Tennessee 11
  • Texas 38
  • Utah 6
  • West Virginia 5
  • Wyoming 3

"State polls have a very good track record, and if that track record is maintained, then what the state polls are telling us is quite clear," Wang said before the election.

"If the election turns out a different way, then the question isn't whether my math is wrong, because my math is quite sound, it's what's up with these state polls."

That's a very different approach from traditional punditry, where value is placed on perceived momentum, age-old political adages and gut instinct.

"One of the values in doing it our way, in which there's a system, is it's all in black and white," says Linzer.

"If it turns out there's a flaw, we can find it, spot it and we can work on addressing it as opposed to people whose commentary is based on some thoughts in their head."

Polling is an obsession in the US, and during this campaign schedules were organised around the 13:00 EST release of Gallup's national tracking poll.

What the pollsters think

"If we don't do the polling these aggregators have nothing to put into their model, [but] they sit back and take the benefit of our hard work and our toil.

Already the number of state polls conducted this year was lower than last time. If everybody decides they're just going to aggregate in the 2016 presidential election they'll have no polls left to aggregate.

So I think as an industry we really have a little issue here about the virtues of doing original polling versus just sitting back and taking other peoples' polls and putting them in models."

Frank Newport, head of the Gallup polling organisation, interviewed for BBC radio's More or Less.

Much of the last few weeks of this year's election was focused on who was really winning and what the polls really meant.

Wang originally started his site in the hopes of calming some of the polling mania by providing a clear look at what the polls really said. The time spent trying to read the tea leaves, he hoped, would be better spent discussing the issues.

A proven model that correctly predicted outcomes could transform the conversation from a discussion about who might win into one about why someone is going to win. But Wang doubts it. "People love a horse race," he said.

But the potential power of these numbers to disrupt the typical politico patter was evident even in this election. Even as pundits were fighting about the value of aggregation, the narrative that Mitt Romney was riding a wave of momentum was tempered and in some cases walked back in the face of the unrelenting statistics.

After the results were in, journalist Dan Lyons wrote: "Nate Silver and his computers may not put Scarborough and his ilk out of business - there's loads of airtime to fill, and windbags are still needed for that.

"But Silver has exposed those guys for what they are, which is propagandists and entertainers."

Increasingly, those who run campaigns are putting more faith in the value of numbers instead of the conventional wisdom of pundits and polls. Witness Obama's successful re-election campaign, based in large part on micro-targeting and data analysis.

But Linzer is convinced the two methods can co-exist. "What they do is incredibly valuable and I don't think what I do replaces that in any way," he says. "I feel like we're all working towards a common goal, which is accuracy and understanding."

It's easy to see why the old guard would feel threatened. That model was based on the predictive power of spin and narrative. It valued gut feeling, and said that you can change the polls if you spin them convincingly enough.

It's no wonder many bristled at a system that stripped all the emotion and intuition from the process.

And yet the system was right - which Linzer could have told you in the first place.
