Our newspapers, TV screens and social media feeds are full of pundits who claim to be able to see the future. Sometimes they’re right; often, they’re wrong.
Could a complete novice – with no prior experience in forecasting – learn to predict world events with better accuracy than the experts? And how could those rational thinking skills be applied in everyday life?
These were the questions that BBC Future hoped to answer when it teamed up with Nesta, the UK innovation foundation, and Good Judgment Inc, a spin-off from an influential four-year project to identify the best forecasting techniques. (Read more about this research and the skills you need to predict the future.)
The result was the You Predict the Future Challenge, which allowed any of our readers to predict the outcomes of various world events.
How, for instance, would the UK economy fare with the increasing uncertainty over Brexit? Between June and October 2019, the participants were asked to predict the value of the pound against the euro on 1 November 2019, by estimating the probabilities of the various potential outcomes (less than 1.00, between 1.00 and 1.10, more than 1.10 but less than 1.20, and so on).
The accuracy was calculated using a formula that rewarded the participants for the probability they assigned to the correct outcome (in this case “more than 1.10 but less than 1.20”) while penalising them for the probabilities they assigned to the wrong answers. Since some questions were inherently harder than others, the formula also considered how well they performed relative to the other players, which helped to account for different levels of difficulty.
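The article doesn’t give the exact formula, but scoring rules of this kind typically resemble the Brier score, which rewards probability placed on the outcome that occurred and penalises probability placed elsewhere. Here is a minimal sketch, assuming (the article doesn’t confirm this) a classic multi-category Brier score, where lower is better; the relative-to-other-players adjustment described above is omitted:

```python
def brier_score(probabilities, correct_index):
    """Sum of squared differences between a probabilistic forecast
    and reality (1 for the outcome that occurred, 0 for the rest).

    probabilities: probabilities over mutually exclusive outcomes
    (they should sum to 1.0); correct_index: the outcome that occurred.
    Lower scores are better; 0 is a perfect forecast.
    """
    return sum(
        (p - (1.0 if i == correct_index else 0.0)) ** 2
        for i, p in enumerate(probabilities)
    )

# Forecasts over the pound/euro buckets described above:
# [less than 1.00, 1.00-1.10, 1.10-1.20, more than 1.20].
# The correct bucket turned out to be index 2 (1.10-1.20).
confident = brier_score([0.05, 0.15, 0.70, 0.10], correct_index=2)
hedged = brier_score([0.25, 0.25, 0.25, 0.25], correct_index=2)
```

A forecaster who confidently backed the right bucket scores better (lower) than one who spread probability evenly, which is exactly the incentive the challenge’s formula created.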
Thousands entered, and the data has been analysed to see who made the best forecasts. The results hold some thought-provoking lessons. So, if you want to become a political forecaster yourself, wish to make wiser investments and business decisions, or simply want to better understand the world around you, it is worth reading on.
Novices beat more experienced forecasters
Before taking part, participants in the You Predict the Future challenge were asked whether they had any previous forecasting experience. You might expect previous practice to have increased performance, but this seemed not to be the case. In this challenge, the novices appeared to perform better.
One reason could be that some people were simply misjudging their level of experience, which was self-reported. Perhaps they had only a passing knowledge of basic probabilistic reasoning from school, or a little experience of gambling on world events, and believed that was far more impressive than it really was. “And because they believe that they have experience, biases might come into play,” says Eva Chen, chief scientist at Good Judgment Inc.
If so, it may be a case of the Dunning-Kruger effect, which describes the tendency of people with low ability to overestimate their skills. People who perceive themselves to be experts can also be more closed-minded and less likely to learn from others, compared to people with a humbler mindset – a phenomenon known as “earned dogmatism”.
Whatever the explanation, the result should offer some encouragement to other novices who would like to try their hand at forecasting. (Read further down to find out what it took to enter the top 10.)
“I think it's quite emboldening for anyone who is interested in understanding how they might use those kinds of skills to better understand the way the world is changing around them,” says Aleksandra Berditchevskaia, a senior researcher for Nesta’s Centre for Collective Intelligence Design. “Even if you don't have experience, even if you're not an expert, you can get started with this relatively easily, and you won't perform that badly at all. You might even perform better than some of the experts.”
Millennials out-performed boomers
If experience failed to make much difference, advanced age didn’t seem to improve forecasting ability either. Older is not necessarily wiser when it comes to forecasting.
Overall, 25-to-35-year-olds were the most accurate age group in the You Predict the Future challenge. The data from the tournament doesn’t offer firm clues about why that would be the case – and Chen emphasises that the finding would need replicating with other studies.
Constantly assessing and updating the information you use can have a dramatic impact on the accuracy of your predictions (Credit: Getty Images)
The finding does, however, seem to mirror known differences in information literacy and critical thinking across generations. A study from researchers at Princeton and New York University, published in the journal Science last year, found that the tendency to share fake news increases with age, with over-65s roughly seven times more likely to post misinformation than 18-to-29-year-olds. Pew Research, meanwhile, found that older people also find it harder to differentiate between factual statements and pure opinion. It’s possible that the younger participants were basing their forecasts on more objective information, while filtering out the misleading data.
“The information landscape has changed so much in terms of where we're getting different sources of information,” says Berditchevskaia. “I think it’s natural that generations who have grown up within those ecosystems are much better able to navigate them effectively.”
Warren Hatch, the chief executive of Good Judgment Inc, points out that younger people have higher “fluid intelligence” – the ability to solve novel problems, which is tightly linked to working memory. It could be that this outweighed the greater knowledge that comes with age, he suggests. Whatever the reason for this particular result, he argues that a range of ages – including young and old – could still be useful for teams of forecasters, since increased “cognitive diversity” appears to improve group problem solving.
Intriguingly, US participants performed better than those in the UK on questions about Brexit. Chen warns not to read too much into this finding without further research, but it’s tempting to speculate that the Americans had less polarised pre-existing opinions on the subject, meaning their forecasts were a little bit more objective.
What it took to enter the top 10
Like many of the participants in You Predict the Future, Nigel Alderton had no previous experience of forecasting – he works in IT – but had been intrigued after reading the book Superforecasting by Philip Tetlock, one of the founders of the Good Judgment Project. “I was just curious,” he says. “I've never tried anything like that before.”
He soon came to enjoy the intellectual challenge of researching new subjects and making the predictions, and the questions about Brexit helped him to think more deeply about the events dominating the news. “I felt like I was sort of involved in the politics, rather than just shouting at the television,” he says.
One question on the relative likelihood of seven different Brexit-related events proved to be particularly fiendish, he says. When he created flow charts representing the outcomes, “it was like a tangled piece of spaghetti of all the various different things that could happen,” he says. “And the situation was changing all the time. Some of the votes in the commons were on a knife's edge and could easily have gone either way.” The best forecasters, including Alderton, analysed each of those possibilities in depth and constantly updated their predictions as new information came available.
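The article doesn’t say how the top forecasters updated their numbers, but one common way to formalise revising a probability as new information arrives is Bayes’ rule. This hypothetical sketch shows how a forecast might shift after a knife-edge Commons vote of the kind Alderton describes (the specific probabilities are illustrative assumptions, not taken from the challenge):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(event | evidence) via Bayes' rule.

    prior: P(event) before the new information arrived;
    likelihood_if_true / likelihood_if_false: how likely the new
    evidence would be if the event is / is not going to happen.
    """
    numerator = prior * likelihood_if_true
    denominator = numerator + (1.0 - prior) * likelihood_if_false
    return numerator / denominator

# Suppose you gave an outcome a 40% chance, and then a vote passes
# that is twice as likely in worlds where that outcome is coming:
updated = bayes_update(0.40, likelihood_if_true=0.6, likelihood_if_false=0.3)
```

The forecast rises from 40% to about 57% – a meaningful but not wholesale revision, which matches the incremental, evidence-by-evidence updating that distinguished the best performers.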
Cases of measles tend to follow seasonal patterns, which means they peak in the spring in temperate climates and during the rainy season in the tropics (Credit: Getty Images)
Alderton finished the challenge in fifth place. His advice to beginners would be to “research, research, research” each question, and to make sure you explore alternative hypotheses before settling on an answer, when the temptation may be to stick with the first good idea that comes to mind. When asked to estimate the number of cases of measles in the US in 2019, for instance, many participants failed to consider the possibility that measles infections follow seasonal patterns. Alderton found that the rate of infections had already reached its peak, and he adjusted his estimates accordingly. (Get some further advice on making good predictions from Nesta.)
Why humility improves your thinking
Thomas Dash, who came 43rd out of more than 7,000 participants, offers similar suggestions. Like Alderton, he is primarily motivated by curiosity rather than a sense of competition. “I like questions that give me the excuse to do some research into something that I know little about,” he says. It is “an excuse to use my brain, a mental reward for getting things right, no stress if I get it wrong”.
Dash says that previous work as a computer programmer (he is now retired) helped teach him the open-minded mindset that is crucial for making accurate forecasts. “Every new program will have bugs, so if you try to be arrogant you will surely be found out,” he says. “Be humble and accept that everyone makes mistakes – the skill is owning up early and fixing them.”
He says that many software projects also required him to consider others’ viewpoints – imagining how the system might be used and the difficulties that might arise. “You need to see things as each of the actors sees them, which is the same as in forecasting.”
Strong feelings about the Brexit vote in the UK may have influenced people's predictions about the outcome of key events (Credit: Getty Images)
The participants were allowed to share their thinking with other forecasters, and both Alderton and Dash emphasise that civility was essential. “Even if you disagree with [someone’s] view, you can learn something from it,” says Alderton. “But it's very easy to trigger people and get into an argument. And so I learned that if you take care when challenging or questioning somebody's forecast, you get a much better response that might improve your own forecast.” Just simple things – such as emphasising the fact that you are not criticising someone personally – were important for creating a more constructive discussion, he says.
And this can have some interesting effects on the way you see the world. Barbara Mellers, at the University of Pennsylvania, has found that the participants in forecasting tournaments tend to become more moderate in their political views – perhaps because, as Alderton and Dash both say, they regularly step into others’ shoes and deliberately look for information that challenges their assumptions.
“Now, as a result of this, I instinctively seek out the opinion of people that I know I disagree with,” says Alderton. “You just get so much more out of it.”
In today’s uncertainty, an ability to see beyond partisan divides could be more important than ever before. The You Predict the Future challenge is over. But if you are curious, humble and open-minded, there are many more opportunities to hone your forecasting skills at Good Judgment Open.
* To learn more about the results of the You Predict the Future challenge, read this detailed update from Nesta.