Today, there are few arguments that can be made against having a diverse, multicultural workforce. Women are as good as men, disabilities shouldn’t hold employees back and race has no impact on a person’s ability to perform a job.
Indeed, research has shown that diverse groups can be better at making decisions than homogeneous ones. Management consulting firm McKinsey has even shown, for example, that companies that rank high on gender, racial or ethnic diversity benefit from financial returns above national industry medians.
Despite this, minorities and members of underrepresented communities continue to be discriminated against when it comes to hiring.
But, in many cases, employers may not even be aware of how unintended biases influence who they hire. Here we look at where these biases can come from, how they creep into the hiring process and some strategies for fighting them.
The ‘right candidate’
When looking for a new member of staff, employers will often draw up a list of the qualities they are looking for in potential candidates. This is a key trap where biases can creep in. Whether employers realise it or not, they often use the last person to hold a post as a benchmark for the type of candidate they’re looking for.
“In some ways people who have been successful in roles in the past become a prototype or reference point for success in our minds,” says Siri Uotila, a research fellow at the Women and Public Policy Program at the Harvard Kennedy School. “If the majority have been white men, then it is more difficult to actively diversify the population.”
This is perhaps a result of the tendency for many organisations to be risk-averse. Finding a candidate who is like the previous employee can seem less risky, explains Uotila. It is a mindset that can also lead employers into another trap – looking for someone who is like ourselves.
“We call that in-group bias or homophily,” says Uotila. “Similarity breeds connection, so we are inherently drawn to people like us.”
Instead, she suggests tapping into the networks of diverse employees through their LinkedIn or other online or in-person networks, though this may not be possible at homogeneous organisations. Luckily, there are other ways to avoid bias and improve diversity throughout the hiring process.
The wanted ad
With limited time and resources, managers may be tempted to recycle old job descriptions. But research suggests they should take a careful look at what they are asking for – the language in job adverts affects the type of people who apply.
“It turns out that language is gendered,” says Uotila. “There are words that we perceive to be female-typed or male-typed.” Words perceived to be female-oriented include “warm,” “collaborate,” and “team”, while ones perceived to be male-oriented include “leader,” “aggressive,” and “ninja”.
“If you write a job application for a coder and describe what you’re looking for as ‘aggressive ninjas,’ that language is going to disproportionately draw men to apply for the job and limit your pool,” explains Uotila.
One Canadian study found that job descriptions with male-typed words were less appealing to women, who sensed that more men worked at the company and were concerned that they wouldn’t fit in. Online job board Totaljobs analysed more than 75,000 job adverts for gender bias and found that adverts for industries like science and marketing were biased toward men, while those for industries like education and customer service were biased toward women.
Since it’s impractical to use only gender-neutral words, a more effective strategy may be to balance out male-coded and female-coded words, according to Uotila. It might be more effective to say something about individual responsibilities alongside something about teamwork, for example.
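The kind of check Totaljobs ran can be sketched in a few lines of code. This is a minimal, illustrative example only: the word lists below mix terms quoted in this article with a few invented additions, and a real analysis would use a validated lexicon and proper text processing.

```python
# Toy gendered-language check for a job advert.
# Word lists are illustrative, not a validated lexicon.
MALE_CODED = {"leader", "aggressive", "ninja", "dominant", "competitive"}
FEMALE_CODED = {"warm", "collaborate", "team", "support", "interpersonal"}

def gender_balance(ad_text: str) -> dict:
    """Count male-coded and female-coded words and flag imbalance."""
    words = [w.strip(".,!?;:'\"()").lower() for w in ad_text.split()]
    male = sum(1 for w in words if w in MALE_CODED)
    female = sum(1 for w in words if w in FEMALE_CODED)
    return {"male_coded": male, "female_coded": female,
            "balanced": abs(male - female) <= 1}

ad = "We want an aggressive ninja who can also collaborate with the team."
print(gender_balance(ad))
# {'male_coded': 2, 'female_coded': 2, 'balanced': True}
```

Pairing “aggressive ninja” with “collaborate with the team” is exactly the balancing act Uotila describes: neither set of cues dominates the advert.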
Uotila says some organisations have seen success by broadening the talent pool rather than focusing on demographic diversity directly. Technology companies that look exclusively for computer science majors, for example, could re-evaluate their needs and recruit candidates with biology, neuroscience or psychology majors, which have more diverse populations.
“Often, you can expand the skills and types of background that you’re looking for,” she says. “Our advice as a rule of thumb is to minimise the number of specific requirements in your job ad.”
It’s no secret that as CVs pour in, hiring managers often turn to computers to help them sort through candidates. While many programs can remove names, gender, and other types of demographic data, the growing use of machine learning algorithms to help identify potential employees is leading to new types of bias in the recruitment process. These sorts of computer algorithms are only as good as the data they are trained on and can pick up biases hidden within these data sets.
“Machine learning algorithms are often trained using historical data from the company,” says Sorelle Friedler, an assistant professor of computer science at Haverford College in Pennsylvania, who studies fairness in machine learning. “There’s the possibility that algorithms can essentially pick up on patterns that the employer might not even want but that are present in their existing set of employees.”
Though much of the responsibility lies with the developers of the algorithms, Friedler suggests that companies insist on fair products. She hopes algorithms will soon start to have more transparency, offering explanations of the logic behind their recommendations. It will then be up to recruiters to double-check the machines.
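The mechanism Friedler describes can be made concrete with a toy model. The sketch below is not any real hiring product; the data and the “college” proxy feature are invented purely to show how a rule learned from biased historical hires reproduces that bias even when the protected attribute itself never appears in the inputs.

```python
# Toy illustration of proxy bias in a model trained on historical hires.
# Hypothetical records: (college, skill_score, hired). Suppose past
# managers favoured College A, whose graduates in this fictional
# dataset overlap heavily with one demographic group.
history = [
    ("A", 6, True), ("A", 5, True), ("A", 4, True),
    ("B", 8, False), ("B", 7, False), ("B", 6, False),
]

def naive_model(college: str, skill: int) -> bool:
    """Hire whoever resembles past hires: majority vote within college."""
    past_outcomes = [hired for (c, s, hired) in history if c == college]
    return sum(past_outcomes) / len(past_outcomes) > 0.5

# Two equally skilled candidates get different outcomes:
print(naive_model("A", 6))  # True  - resembles past hires
print(naive_model("B", 6))  # False - penalised by the proxy feature
```

Note that the model never sees race or gender; the skew enters through a correlated feature in the training data, which is why Friedler argues for transparency about the logic behind a recommendation.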
While phone interviews are “blind” because you can’t see the candidate, research has shown that people can still infer information about a person’s race and other demographic factors, such as socioeconomic status, from the sound of their voice.
One recent study by Faye Cocchiara, an instructor of management at Arkansas State University, found that prospective employers categorise job applicants using these sociolinguistic cues.
“We were interested in what would happen to a person who sounded black and actually wasn’t,” she explains. “About 89% of people were able to classify the people’s voices as black.” The research found that whether the voice came from a black person or not, those who were classified as black purely on the sound of the voice were subjected to more negative judgements.
“We can also determine someone’s class by the way someone speaks,” Cocchiara says. “When we hear someone who has a European accent for example, we think and believe certain things about them, which can cloud our judgement.”
Instead, Cocchiara recommends that recruiters understand what their implicit biases are, by taking an implicit associations test.
“Some people say they have no bias against anything, and that’s laughable. I’d rather someone say I have these biases, but I understand them and don’t act on them.”
Of course, at some point, selected candidates will come into the office for a face-to-face interview. It is well known that first impressions count and several studies have shown that appearance plays a key role in candidate selections. One study found, for example, that being attractive was an advantage for people with mediocre CVs.
Another study from the UK indicates that if you are considered to be overweight or obese, you could find it more difficult to get a job. Participants were asked to evaluate the suitability of job applicants whose pictures were attached to a CV.
“What we saw was that suitability of participants was lower for people applying that were obese, versus people who were normal weight or whose weight wasn’t revealed,” says Stuart Flint, senior research fellow in public health and obesity at Leeds Beckett University and an author on the study. This bias was compounded if the candidate fit into multiple categories that are often discriminated against. Obese women, for example, were considered even less suitable for the job than obese men.
“People who are overweight are considered lazy and less intelligent – stereotypes that are unfounded,” says Flint. “There’s no evidence they are less able to do the work or are less intelligent than an average-weight person.”
Workplaces that provide perks like gym memberships can also accidentally promote the belief among their employees that individuals are to blame for being overweight, according to Flint. “We know there are many factors that contribute to weight status, many of which are outside your control.”
He hopes that educating recruiters to be aware of these implicit biases may help to reduce the impact this can have.
In some cases, it’s also beneficial to anonymise the candidate, even in person. A study of American symphony musicians found that when screens were used to conceal the identity of the musician during auditions, the chances a woman would advance to the next round of selection increased by 50%.
While setting up a physical screen during an interview may not be practical in the usual office setting, experts say there are other ways that can reduce bias in the interview process.
Uotila believes structured interviews are critical – questions should be determined in advance, asked in the same order, and scoring should be completed as quickly as possible.
“Research has shown that unstructured interviews are one of the least predictive hiring tools in terms of correlating to the candidate's ultimate success in the job,” she says. “They are a ripe breeding ground for lots of different types of biases, including something called the halo effect. If a candidate does something that I like in the first couple minutes of the interview, that clouds my judgment of everything they say and do after that point. I may view them too positively and I won’t be able to evaluate answers objectively.”
Setting criteria in advance has also been shown to mitigate the influence of biases. In one study, people were asked to evaluate the CVs of male and female candidates for the job of police chief, but the researchers manipulated the CVs so that candidates had different credentials – they were either more streetwise or more educated. In their evaluations, people changed their criteria to match the CVs of the male candidates.
However, when evaluators had to rate the importance of streetsmarts versus education before they learned the candidate’s gender, they gave fairer evaluations.
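The pre-commitment the study points to amounts to fixing the weights before any CV is opened. The sketch below is a simplified illustration; the criteria names and numbers are invented for the example.

```python
# Weights committed before seeing any candidate, then applied
# identically to every CV. Ratings are on a 0-10 scale (invented).
CRITERIA_WEIGHTS = {"streetwise": 0.4, "education": 0.6}

def score(candidate: dict) -> float:
    """Apply the pre-committed weights to a candidate's ratings."""
    return sum(CRITERIA_WEIGHTS[c] * candidate[c] for c in CRITERIA_WEIGHTS)

a = {"streetwise": 9, "education": 5}
b = {"streetwise": 5, "education": 9}
print(round(score(a), 1))  # 6.6
print(round(score(b), 1))  # 7.4
```

Because the weights are locked in first, an evaluator cannot quietly re-rank the criteria after discovering which candidate has which credentials – the shift the police-chief study observed.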
Alexandra van Geen, a consultant based in the Netherlands who specialises in improving diversity at companies, has shown in her own research that joint evaluations, in which multiple candidates are interviewed at the same time, can reduce this type of bias.
“We found it could help recruiters override their first responses, which are sensitive to our biases and heuristics,” she says. “Whenever you have to evaluate one person and you don’t have any comparable material, it’s very subjective.” That’s harder to do when you have to compare two candidates side by side.
Even once a candidate has been selected, bias can still creep into recruitment. Organisations obviously have an incentive to pay employees as little as they can while still getting good work from them. On the other hand, they should want to steer clear of gendered pay gaps.
“The traditional view is that women are generally less inclined than men to negotiate salaries,” says Andreas Leibbrandt, a professor of economics at Monash University in Melbourne, Australia, who has studied salary negotiations. “We observe this is not true. In an environment where the employer signals that salary negotiations are OK, women are equally likely to negotiate. In contrast, if there is ambiguity about the extent to which salaries are negotiable, women are significantly less likely to negotiate.”
This may be because women who negotiate on their own behalf are socially penalised and perceived as overly demanding. To avoid unfair differences in wages, Leibbrandt suggests organisations audit themselves.
“To reduce bias, organisations need to first detect where the bias kicks in, how it exactly manifests, and how it can be undone,” adds Leibbrandt.