US intelligence agencies hope the "wisdom of the crowd" can help them predict the future.

The upcoming release of the James Bond movie Skyfall, combined with the 50th anniversary of the franchise, has sent fans of the suave spy into overdrive. Speculation about the plot and who will sing the theme tune (Adele) has been joined by stories that pick apart everything from his ingenious - but scientifically dubious - gadgets to the ins and outs of his sex life.

But now, James Bond fans and wannabe spies alike may have the ultimate outlet for their spy ambitions. Research firm Applied Research Associates has just launched a website that invites the public—meaning anyone, anywhere—to sign up and try their hand at intelligence forecasting. The website is part of an effort, sponsored by the Intelligence Advanced Research Projects Activity (Iarpa), to understand the potential benefits of so-called crowdsourcing for predicting future events. Crowdsourcing aims to use the “wisdom of crowds” and was popularised by projects like Wikipedia.

Like Darpa, its better-known counterpart in the Pentagon, Iarpa funds far-out research ideas. However, Iarpa works on ideas that could eventually be used by the likes of the Central Intelligence Agency (CIA), rather than the military. “The goal that Iarpa has is to eventually transition this to the intelligence community, and use it for something like the National Intelligence Estimates,” says Jenn Carter, who works on the project.

There’s good reason for Iarpa’s interest in finding new ways to collect useful information: the intelligence community has often been blasted for its failure to forecast critical world events, from the fall of the Soviet Union to the Arab Spring that swept across North Africa and the Middle East. It was also heavily criticised for its National Intelligence Estimate in 2002, which supported claims that Iraq had weapons of mass destruction.

Those failures raised larger questions about how the intelligence agencies come up with forecasts, which is usually a deliberative process involving a large number of analysts. The Iarpa project, known officially as Aggregative Contingent Estimation, is looking at whether crowdsourcing can result in more accurate forecasts about future events than traditional forms of intelligence estimation.

Applied Research Associates actually started the project last year with another website called Forecasting Ace, which had over 2,000 registered contributors making predictions on everything from the future of space exploration to political elections.  On the new website, Global Crowd Intelligence, the company hopes that number will grow substantially by making forecasting more like a game of spy versus spy.

“When we contacted our contributors, they said we should try to make the whole process more fun,” says Dirk Warnaar, the principal investigator for the project.

Indeed, what users wanted, it turned out, was something competitive, so that’s what the company has given them. The new website rewards players who successfully forecast future events by giving them privileged access to certain “missions,” and also allowing them to collect reputation points, which can then be used for online bragging rights. When contributors enter the new site, they start off as junior analysts, but eventually progress to higher levels, allowing them to work on privileged missions.

The game works by allowing the newly minted analysts to choose from any number of “missions” to forecast. The forecasts run the gamut, from those of obvious interest to intelligence agencies, such as whether “a government force will gain control of the Somali town of Kismayo before 1 November 2012”, to market predictions, such as the likelihood that Apple will introduce a mini-iPad by a specific date. Analysts attach a specific probability to the event, such as 80%, and then wager “reputation points” on their forecast. They win points depending on how closely their forecast matches reality. The more points you win, the higher you progress.
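The wager-and-score mechanic described above can be sketched in a few lines. The scoring rule below (a Brier-style squared error scaling the wager) and all the function names are assumptions for illustration; the article does not describe the site's actual point formula.

```python
def brier_score(forecast: float, outcome: bool) -> float:
    """Squared error between a probability forecast and the 0/1 outcome."""
    return (forecast - (1.0 if outcome else 0.0)) ** 2

def points_won(wager: float, forecast: float, outcome: bool) -> float:
    """Scale the wager by accuracy: a perfect call keeps the full wager,
    a maximally wrong one loses it all."""
    return wager * (1.0 - brier_score(forecast, outcome))

# An analyst wagers 100 reputation points at 80% that an event occurs.
print(points_won(100, 0.8, True))   # event happened: most points kept
print(points_won(100, 0.8, False))  # event did not happen: most points lost
```

Under a rule like this, confident forecasts pay off handsomely when right and cost dearly when wrong, which is what pushes forecasters toward honest probabilities.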

The idea of crowdsourcing geopolitical forecasting is increasing in popularity, and not just for spies. Wikistrat, a private company touted as “the world’s first massively multiplayer online consultancy,” was founded in 2010, and is using crowdsourcing to generate scenarios about future geopolitical events. It recently released a report based on a crowdsourced simulation looking at China’s future naval power.

Warnaar says that Wikistrat’s approach appears to rely on developing “what-if scenarios,” rather than attaching a probability to a specific event happening, which is the goal of the Iarpa project.

Spies like us

Of course, the ultimate question is: how good are the crowd’s predictions? Warnaar compares this science to weather forecasting, which, albeit imperfect, still provides useful and reasonably accurate information about future events. Part of what helps weather forecasters improve their predictions is constant feedback: if they predict rain, and they get it wrong (or right), they instantly learn. “This constant feedback makes them well-calibrated,” says Warnaar.

In fact, this sort of “self-calibration” is how one of the crowdsourcing models works: if the “crowd” predicts that an event is going to happen with an 80% probability, but in reality this should have been 60% (crowds tend to be overconfident), then the model is able to aggregate all of the information to churn out a more accurate prediction.
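A toy version of that recalibration idea: average the individual forecasts, then pull the result back toward 50% to correct for the overconfidence the article mentions. The linear shrinkage transform and its strength are assumptions for illustration; the project's actual models are more sophisticated.

```python
def recalibrate(p: float, shrink: float = 0.5) -> float:
    """Shrink a probability toward 0.5 to correct for overconfidence."""
    return 0.5 + (p - 0.5) * shrink

def aggregate(forecasts: list[float]) -> float:
    """Average individual forecasts, then apply the calibration step."""
    mean = sum(forecasts) / len(forecasts)
    return recalibrate(mean)

# Three overconfident forecasters cluster around 80%; the calibrated
# aggregate lands closer to the 60% figure in the example above.
print(aggregate([0.80, 0.85, 0.75]))
```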

The system is also designed to ensure that any efforts to sabotage forecasts are minimized. “Everyone can make forecasts but not all of those forecasts are included in our models and each forecast may have a different weight,” says Warnaar. “You would therefore have to be a consistently good forecaster to be able to influence the aggregate forecast with a rogue prediction, but even then your forecast must be consistent with your previous pattern.”
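The weighting Warnaar describes can be sketched as a weighted average in which each forecast counts in proportion to its author's track record, so a rogue prediction from an unproven account barely moves the aggregate. The specific weighting scheme here is an assumption for illustration, not the project's actual model.

```python
def weighted_forecast(forecasts: list[tuple[float, float]]) -> float:
    """Combine (probability, track_record_weight) pairs into one forecast."""
    total_weight = sum(w for _, w in forecasts)
    return sum(p * w for p, w in forecasts) / total_weight

# Four seasoned forecasters cluster around 30%; one rogue newcomer
# (low weight) tries to drag the aggregate up with a 99% forecast.
crowd = [(0.30, 5.0), (0.28, 4.0), (0.33, 6.0), (0.31, 5.0), (0.99, 0.5)]
print(weighted_forecast(crowd))  # stays close to 30%
```

This is why, as Warnaar notes, an attacker would need a consistently good track record before a single prediction could meaningfully skew the result.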

To catch any potential rogue elements, the system also flags up any unusual activity for further scrutiny. “So far we’ve not found any evidence that a single forecaster or group of forecasters was able to purposely skew the results,” he says.

The project is already yielding results: in the first year, Warnaar says, they were able to show that the crowdsourced forecasts were 25% more accurate than forecasts produced by a control group, which involved simply averaging the forecasts made by a number of individuals. The plan is to double that improvement over the next year, to 50%.

In addition to improving intelligence forecasts, the research may also yield other benefits, such as understanding what type of person is better at predicting future events. “There is very little research that points to what makes a good forecaster,” says Warnaar.

Those working on the project are careful to note that it is about research, not spying. The names and personal information of users are not provided to Iarpa, only the results of the forecasts. Users signing up must provide an email address, but not a real name, and answer only two questions: whether they are over the age of 18 and whether they are American citizens.

But for those who dream of being James Bond, or conversely, worry that their predictions could be used by spies, the website has a simple disclaimer: “Forecast topics are not related to actual intelligence operations.”
