Could replacing LinkedIn profile pictures with dog photos help us be more objective?

Unsurprising statement: hiring good people is hard. Speaking from experience, I can attest to the difficulty of filling empty seats with people who I believe are qualified or a good cultural fit.

We know that it is even harder, however, to avoid our own biases when hiring – whether they are conscious or unconscious. From a candidate’s perspective, no one likes to think of a manager or recruiter excluding them because the recruiter associates their name with someone they don’t like or with a university they weren’t admitted to – or, worse, simply because they don’t like their profile photograph.

Aaron Weyenberg, a New York City-based director of research and development at not-for-profit TED, needed a way to review candidates without the effects of unconscious bias.

So, he turned everyone into dogs.

Weyenberg cheekily launched Profile of Dogs, a Chrome browser extension that automatically turns users’ self-selected LinkedIn profile pictures into a random dog image. (If the extension – installed via Chrome’s web store – sounds vaguely familiar, that might be because its name is a play on the Wes Anderson film Isle of Dogs.)
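The core mechanic of such an extension is simple to sketch. The snippet below is an illustrative guess at what a content script like this might do – the image list, selector and function names are all hypothetical and are not taken from Profile of Dogs itself:

```javascript
// Illustrative sketch of a Profile of Dogs-style content script.
// Everything here (image URLs, selector, names) is hypothetical.

const DOG_IMAGES = [
  "https://example.com/dogs/corgi.jpg",
  "https://example.com/dogs/beagle.jpg",
  "https://example.com/dogs/husky.jpg",
];

// Pick one dog photo at random.
function randomDogImage(images = DOG_IMAGES) {
  return images[Math.floor(Math.random() * images.length)];
}

// Swap every matching image on the page for a random dog photo and
// return how many were swapped. A real extension would use a
// LinkedIn-specific selector and would be declared as a content
// script in its manifest.json.
function replaceProfilePhotos(doc) {
  let swapped = 0;
  for (const img of doc.querySelectorAll("img")) {
    img.src = randomDogImage();
    swapped += 1;
  }
  return swapped;
}

// Guarded so the sketch can also load outside a browser.
if (typeof document !== "undefined") {
  replaceProfilePhotos(document);
}
```

In a real extension, the script would be restricted to linkedin.com pages via the manifest’s `matches` field, so the swap only ever happens where profile photos appear.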

LinkedIn is a substantial part of the recruiting process in many countries, and although many companies still ask for a CV and cover letter, candidates are often pre-screened on LinkedIn before they are contacted.

“All kinds of information that has nothing to do with a person's qualifications can be involuntarily placed in our line of sight,” says Weyenberg. “The rise of LinkedIn certainly hasn't done anything to slow that. They don't offer any kind of browsing mode that suppresses irrelevant information and puts forward more substantive information.”

Someone, he said, had to do it for them.

Can we level the playing field on LinkedIn by turning ourselves into randomly generated photos of dogs?

Aware of the unconscious face bias he and his team members might harbour, Weyenberg’s current hiring policy includes conducting first-round interviews with candidates via audio-only telephone calls. Still, he says, LinkedIn “often finds its way into the process before that. And what that does is expose me to information I actually don't want and doesn't help me – like their appearance (and thereby their approximate age), name, et cetera.”

Weyenberg calls Profile of Dogs a “sort of Trojan horse” – something fun that concurrently addresses a real issue.

Poodles versus pit bulls

The problem is that people make associations with all kinds of things, says Alexander Todorov, assistant chair of psychology at Princeton University. Even dogs.

“Dobermans have one kind of reputation that is different than a golden retriever,” he says. “If I see someone with a picture of, say, a pit bull, there are specific inferences that come to my mind for this person – even if they could be completely false.”

Plus, our reactions to faces differ from our reactions to other identifying signals. We are much less aware of the presumptions we may make on the basis of someone’s looks. People are often aware that they may hold certain stereotypes around well-discussed issues like race and sexual orientation, and will often work to correct for those biases, Todorov says. But with faces, “there’s no inclination to correct, and most of these biases will go undetected”.

This may still be the case even though we know a profile picture is randomly generated. Randomness may help reduce bias, Todorov says, but you can’t avoid the signals you interpret unconsciously. “We certainly have rich stereotypes about [different breeds of] dogs.”

In short, there is a likelihood you could make a snap judgement about someone’s trustworthiness with a randomly generated dog image, too. “This will dominate judgement, even if you have other useful information on the screen,” he says. “The mind is a big associative machine.”

Of course, Weyenberg is not the only person trying to eliminate unconscious bias. Many recruiters and even software companies are experimenting with removing identifying information, including names and graduation years, from applications to ensure more parity.

Still, nearly anything can trigger bias.

“Obscuring faces is only the first step. Many resumes contain other clues which allow bias to impact the hiring decision. For example, certain schools are associated with higher socioeconomic status and that allows for bias based on socioeconomic status,” says Sharon E Jones, CEO of global inclusion consulting firm Jones Diversity. “Some of the most common sources of biases relate to names, schools and group association whether in school or professionally.”

And even if we are able to remove bias from an initial screening, there’s little way to stop it from creeping in at later stages.

“Most people believe they are fair and support a meritocracy in hiring and in the workplace,” says Jones. “What people don’t understand is that wanting to be fair doesn’t make you fair.”

Still, the extension is fun – and certainly does at least some of the job Weyenberg intended. He says his next version might replace profile names, another big source of association and bias, with common dog names. Here’s hoping you don’t still hold a grudge from the time Spot bit you when you were a child.

