There are costs that accrue when we are intellectually dishonest with ourselves.
How do you “read” a CEO to know what he’s thinking? It turns out not to be as difficult as you might think — even if you’re watching the most stone-faced of chief executives.
I was sitting in the office with a chief executive who was contemplating a big acquisition. He was reviewing all sorts of reports and analyses from members of his team, as well as the recommendations of a big consultancy brought in to provide “objective” data. But I knew what he was going to do before he even announced it.
How? I just followed the data. That is, the data the CEO chose to pay attention to.
Not all data are created equal, and it turns out we constantly decide which bits of information are better than other bits. We gravitate to the point of view that fits what we're already thinking, and ignore evidence that might challenge our assumptions about what makes sense and what doesn't.
CEOs aren't the only ones subject to this way of thinking. The old expression that people see and hear what they want to believe is truer than you might think.
When Law School Admission Test results in America deteriorated recently, the association administering the exam said the weaker scores reflected a lowering of standards at the schools. Faced with this criticism, the deans at 80 law schools fired back that the test doesn't accurately reflect the quality of their students. Of course, the deans had no problem with the test in years past, when scores were generally higher. But instead of hearing something they'd rather not, they blamed the data.
This is no different than cherry-picking nutrition studies to justify whatever type of eating preferences you have. High salt, low salt. High fat, low fat. With information readily available to support almost any predilections we have, there’s a veritable smorgasbord of options out there, just ready for us to grab them and say, “there’s this study…”
Sometimes such data arrogance has even bigger implications.
When Hurricane Katrina devastated the New Orleans region in 2005, the US government's Department of Homeland Security, charged with monitoring the hurricane and formulating an appropriate response, made the wrong call, ignoring key topographical data.
Instead, Katrina was judged to be a hurricane like so many others that hit Florida each summer, and therefore not worthy of extreme measures. With that mindset in place, the highly decorated retired Marine Corps general in command of the probability and risk-assessment team proceeded to interpret the various information coming into his office in a manner fully consistent with his pre-judgment, while disregarding the rest. Reports of levees breaching were viewed as suspect and incomplete — not as factual and requiring an immediate change of course in the response plan. When the general saw a report on CNN showing people in the French Quarter of New Orleans partying in the streets, he concluded, as he testified to Congress, that Katrina wasn't going to be as bad as some had feared.
The biggest problem? Most of New Orleans is below sea level; most of Florida isn't. And the French Quarter was one of the few places in the city above sea level and hence relatively immune to the devastating effects of flooding.
The truth is that no one is immune to this kind of behaviour. CEOs, generals, deans, dieters and pretty much everyone reading this column are subject to the very human tendency of wanting to be right, and of looking for a way to prove their point of view to themselves and those around them.
But we pay a price for doing so. CEOs who make bad decisions, academic leaders who ignore underlying problems, dieters who cause themselves more harm than good and generals who are ineffective in times of crisis — these are all costs that accrue when we are intellectually dishonest with ourselves.
While it’s hard to fight human nature, we are not defenceless. There are ways to fight our own data-blindness.
First, ask for the opinion of someone you trust who is unafraid to tell you what he or she really thinks. Give that person permission to be critical of your thinking, to push you and to question why you chose to rely on the data you did.
Second, be honest with yourself. OK, maybe this isn’t realistic, but I like to believe that we are capable of recognising some of the little ways in which we choose to fool ourselves. It’s good to start by asking yourself why you may be wrong, and why you are so comfortable relying on that one study that supports your preference.
Give yourself the chance to see the world the way it is, and not the way you wish it were.