
Why we find it difficult to recognise a crisis

Woman in mask walking past hand graffiti (Credit: Getty Images)
The current pandemic has affected some countries more than others, partly because they have been slow to react to the crisis. That, it turns out, is a very human response.

The coronavirus pandemic is upon us, and for many people it feels like it came out of nowhere.

The UK saw its first reported cases at the end of January, by which time the virus was already spreading around the world. But it was not until the middle of March that UK Prime Minister Boris Johnson “advised” people to avoid non-essential travel and socialising, and only on 23 March did he order the country into lockdown. The slow UK response came in for widespread criticism from public health experts.

In the US, President Donald Trump has overseen a chaotic response. The country has had a dire shortage of testing kits, so its government does not know how many people have had the disease. President Trump also repeatedly downplayed the dangers of the disease – although despite what you may have read he did not (quite) call it a hoax. He also incorrectly compared it to seasonal flu, and falsely claimed the US response was more comprehensive than any other country's.

How did two of the most advanced countries in the world, with technology and expertise to spare, fail to recognise the crisis as it unfolded? A final answer will only come with hindsight and public inquiries, but there are many known psychological processes that cause individuals and organisations to miss the signs of a coming emergency – even when it is staring them in the face.


In 1980, psychologist Neil Weinstein published the first study of what has come to be known as “optimism bias”. He found that people are “unrealistically optimistic” about their own future prospects.

Weinstein asked over 200 students to rate their chances of experiencing different life events: either positive things like owning their own home or having a gifted child, or negative things like developing cancer or getting divorced. The students also rated the chances of other people in the group experiencing the same events.

The lockdown in major cities has been stark and swift – partly because it took so long for some governments to act (Credit: Getty Images)

Most of the students thought they had better than average prospects, for example saying they were less likely to get cancer than everyone else, and more likely to own their own homes.

“That’s been known and demonstrated in many different ways,” says Tali Sharot of University College London in the UK.

Sharot says the root of the bias may be the way we learn new information. In a 2011 study, her team found that people are quicker to update their beliefs in response to information that is better than expected, compared to information that is worse than expected.

There is already evidence that the bias is at work, possibly explaining why so many people have failed to adopt precautions like social distancing

It is easy to imagine how the optimism bias could affect our beliefs about Covid-19. If experts were to say that the lockdown would be eased in two weeks, people would quickly update their beliefs, says Sharot. But if the experts instead said it would drag on for longer, people would update their beliefs less. “They say ‘I don’t really believe it’, ‘things change’, and so on,” she says. “As a consequence, you then generate these biased beliefs.”

Indeed, there is already evidence that the bias is at work, possibly explaining why so many people have failed to adopt precautions like social distancing.

For example, in a study that has not yet been peer-reviewed, Toby Wise of University College London in the UK and his colleagues surveyed 1,591 Americans on their beliefs and actions regarding the virus. While the volunteers’ awareness grew over time and they started taking protective measures, they underestimated their personal risk of infection, relative to the average person.

Good fortune can sometimes create problems – if two airliners narrowly miss each other, we could underestimate how close we came to disaster (Credit: Getty Images)

Similarly, Benjamin Kuper-Smith of the University Medical Center Hamburg-Eppendorf and his colleagues surveyed people in the UK, US and Germany. Their volunteers not only underestimated their risk of getting infected, they also underestimated their chances of passing the virus on to others.

“We are now conducting a large study, and our pilot data shows the same thing,” says Sharot. In her pilot, “not a single person said they were more likely to get the virus”.

People are also susceptible to a subtler mistake, dubbed “outcome bias”.

“A very obvious example is if you have two airplanes that nearly collide, but don’t,” says Robin Dillon-Merrill at Georgetown University in Washington DC. She says one possible response is, “Wow, that was really close, next time that could actually happen” – which might prompt changes to current practice. However, often people do not respond like that. Instead they say, “Wow, I’m a fabulous pilot and my flight skills avoided that entirely”. This is outcome bias: the fact things turned out OK can cause us to underestimate how close they came to going badly wrong.

The Covid-19 epidemic is a clear instance of governments and organisations not having learned from near misses

Outcome bias was described by Jonathan Baron and John Hershey in 1988. They gave volunteers descriptions of decisions other people had taken in uncertain situations, such as gambling. The volunteers were asked to rate the other people’s decision-making and reasoning. They rated the other people more highly if the outcomes were favourable – even though chance played a large role in the outcomes. In other words, the fact the decisions happened to work out caused the volunteers to overrate the reasoning that went into making them.

The Covid-19 epidemic is a clear instance of governments and organisations not having learned from near misses. In the past 20 years there have been two outbreaks of diseases caused by coronaviruses, the group to which the new virus belongs. The Sars outbreak of 2003 and 2004 killed at least 774 people before it was contained, while the ongoing Mers outbreak which began in 2012 has killed 858. Covid-19 has already far surpassed both, at more than 76,500 deaths at the time of writing.

“I don’t think that we’re experiencing anything like a near-miss at the moment, unfortunately,” says Dillon-Merrill. “This is not a near-miss, this is an absolute hit.”

The 2003 Sars outbreak was quickly recognised as a serious danger to health (Credit: Getty Images)

Even if people are presented with clear evidence that a crisis is unfolding, they may deny the reality of it. Many psychological factors contribute to denial, but a crucial one is confirmation bias. If people are motivated to believe something, they may only seek out evidence which supports that point of view, and ignore or dismiss anything that contradicts it.

Dillon-Merrill points to a recent story from the Los Angeles Times. On 8 March, a woman celebrated her 70th birthday with a party at the Trump National Golf Club in southern California. A week later it emerged that one partygoer had tested positive for Covid-19. Many other attendees soon tested positive.

Guests had been advised not to attend if they were ill, but that was not enough. “A lot of people who are transferring the virus don’t have symptoms,” says Dillon-Merrill. But everyone rationalised that away. “With the confirmation bias, you find the data that supports your position,” she says. “What you really want to see is: ‘I’m healthy. I really want to have my party’. We assume nobody’s showing symptoms and nobody’s coughing on anybody else, why can’t we have our party?”

Our tendency to conform can be beneficial, but in this case it is hurting us

The story also illustrates another problem: in uncertain situations, we look to each other for guidance, but our neighbours are not always the best guides. “I think a very strong influence of all of this is social norms,” says Dillon-Merrill. “Because the information is not clear, it’s changing or it’s uncertain, people are looking for clues and cues, and are tending to do what they see is the social norm.”

This may explain so-called panic buying of unnecessary items like bottled water. If you see others stocking up on bottled water, you may do it too. Our tendency to conform can be beneficial, but in this case it is hurting us.

At the level of government and other big organisations, this tendency to conformity can manifest as “groupthink”. Intelligent and experienced decision-makers sometimes stop discussing the various options openly and instead uncritically accept whatever plan they think everyone else is settling on.

The 1962 Cuban Missile Crisis came after the "groupthink" disaster of the Bay of Pigs, where advisers came up with plans of action without clear leadership (Credit: Getty Images)

Groupthink was first described by psychologist Irving Janis in the early 1970s, most notably in his book Victims of Groupthink. Janis studied President John F Kennedy’s decision-making in two international incidents: the failed Bay of Pigs invasion of Cuba in 1961, and the Cuban Missile Crisis of 1962. The Bay of Pigs was a major US foreign policy failure, and Janis found that Kennedy’s advisers were reluctant to contradict him or each other. “With the Cuban Missile Crisis, a lot more of the meetings happened without him in the room, where they were forced to come up with alternative ideas, and that forced people to weigh the pros and cons of ideas,” says Dillon-Merrill. “The simplest idea to overcome groupthink is a better-structured process for making decisions.”

There is a related concept called “functional stupidity”, described by Mats Alvesson at Lund University in Sweden and Andre Spicer at City University of London in the UK. The pair found that organisations often hire clever and talented people, but then create cultures and decision-making processes that do not encourage them to raise concerns or make suggestions. Instead, everyone is encouraged to emphasise positive interpretations of events, leading to “self-reinforcing stupidity”.

The good news is that it is possible to overcome all these biases, but it requires changing the way many organisations work.

Crises do not come out of nowhere, but are the most extreme versions of things that happen all the time

Let’s start with the optimism bias, which arises because we learn better from good news than from bad. Sharot has found that if we are stressed, this reverses and we become better at learning from unexpectedly bad developments. The implication is that a bit of stress at work is good, because it will help us learn about threats and thus mitigate them.

As for the outcome bias, which causes us to dismiss near misses, Dillon-Merrill and her colleagues have found that having an organisational culture that emphasises safety can make all the difference. This is easiest when the dangers of the near miss are obvious, for instance when a plane’s engine catches fire. It is harder if the near-miss is a subtle one, but a well-run organisation can still learn from such near misses.

The structure and culture of an organisation are key, says Kathleen Sutcliffe at Johns Hopkins University in Baltimore, Maryland. Crises do not come out of nowhere, but are the most extreme versions of things that happen all the time, so it is possible to anticipate them in outline and build up resilience.

If we see people panic buying goods – like bottled water, for instance – it can force us to behave similarly (Credit: Getty Images)

Sutcliffe and her colleagues have identified five characteristics of the best-prepared “high-reliability” organisations, which rarely experience disasters.

First, such organisations are “preoccupied with failure”, says Sutcliffe. “What I mean by that is they understood what they wanted to achieve, but they also thought a lot about the ways in which they could get sidetracked and the ways in which things could go wrong.” This includes taking near misses seriously. “When you say ‘preoccupied with failure’, people jump to the conclusion that you’re not very positive and can’t celebrate successes. That’s not at all what we’re saying,” she emphasises.

High-reliability organisations also encourage their employees to avoid simplification and embrace complexity, even if that means abandoning appealing positive narratives. They spend most of their time focusing on the here and now, rather than on big-picture strategy. They build resilience, mostly by ensuring that their staff have the time and encouragement to tackle problems rather than sweeping them under the carpet.

And finally, they have flexible decision-making structures, meaning decisions can variously be made by low-ranking people on the ground and upper management, depending on the nature of the crisis.

If governments want to avoid being caught flat-footed by the next pandemic-level crisis, they might do well to take some of Sutcliffe’s advice. Otherwise their human biases may betray them – again.
