GetApp Lab

Hiring bias: why your recruitment practices may be unfair and how algorithms can help

US companies have a diversity problem. Only 14 percent of CIOs are female. At Microsoft, the representation of female employees even declined by 1 percent in 2016, despite the company’s diversity program. Meanwhile, only 2 percent of tech executives are black and 3 percent are Latino.

The question is, how much of this can be traced back to a (conscious or unconscious) hiring bias in companies’ recruitment processes?

Brittany King, senior recruiter and founder of The Career Collective, believes that bias is almost always present in the recruitment industry.

“As a senior recruiter who has worked in the industry for a long time, I have witnessed overt and subtle bias in the recruitment process,” says King.

Elad Shmilovich, co-founder and CMO of diversity enablement software Joonko, agrees that hiring practices are biased. He says:

“It’s the same as any business process that is led by humans – because ultimately, if you’re human, you’re (unconsciously) biased. That’s how our brain works. Our brain aims to make the easiest and most effortless decision, based on what it considers familiar and safe. That’s why, for example, under time or work pressure, recruiters will favor candidates who are either similar to them or considered “better” by social patterns.”

The research supports this: in a recent survey, a quarter of respondents said they believe all companies have biased recruiting practices, while a third think most organizations do.

Why are hiring practices biased?

Bias can be unconscious, as Shmilovich mentions, but it can also arise due to the way recruitment is handled either by internal hiring managers, or by external firms.

Will Hunsinger, COO of recruiting services firm Riviera, believes that bias can arise due to the speed and scalability often required in the hiring process, as well as the wealth of information and profiles that recruiters have to sort through.

He explains:

“In the interest of sanity, recruiters naturally begin shaping their own elimination bias simply in response to the volume of information they face; they sort at face value on static information sources (e.g. good school, top companies, self-selected skillsets, etc.) and look past the deeper values of comprehension around concepts and constructs.

“You constantly hear about a “war for talent,” but the truth is that plenty of talent exists, but little effort is put into evaluating the greater spread outside of traditional norms.”

Analysis from research firm Gartner (content available to Gartner clients) says that the way you reward recruiters and external recruitment firms can also lead to bias: “If the objectives include speed and monetary compensation based on the level of candidate they send to you, the results will be skewed toward older male candidates from the dominant local ethnic group. Why? Because candidates like that match expectations and we are more likely to hire “people like us” due to unconscious bias. Older workers are generally more highly paid and so maximize the reward for the recruitment firm.”

Diversity in different forms

In the same research, Gartner lists three types of diversity in the workforce: legacy diversity, experiential diversity, and diversity of thought.

Studies have found evidence of bias in all three of these diversity areas in the recruitment process. On the legacy front, résumés with old-sounding names, or that list activities perceived as old-fashioned, are rated as less suitable for a job than identical résumés with modern-sounding names.

Jon-Mark Sabel, content marketing strategist at recruitment software HireVue, says:

“Asian-sounding names receive 20 percent fewer callbacks, according to this (very) recent study. And of course, there is also the infamous study where “Emily” and “Greg” were 50 percent more likely to be called back than “Lakisha” and “Jamal.” Were the recruiters in this situation racist? Probably not – but they did have preconceived notions about the way a person’s name is related to their ability to perform.”

Experiential bias in the recruitment process often shows up in the way that companies screen for economic background and the schools that candidates attended. For example, research from Lauren Rivera, an associate professor at the Kellogg School, found that applicants’ socioeconomic backgrounds play a large role in the hiring decisions of elite professional services firms.

Harj Taggar, CEO of recruitment software Triplebyte, says:

“We regularly see companies move faster to recruit engineering candidates who have attended elite schools like Stanford, over other candidates with similar levels of experience from less well known schools.”

Thought bias shows up in a wide variety of ways in the recruitment process, including the bandwagon effect, and confirmation and blind-spot bias. King believes that talent acquisition professionals use cognitive bias in their recruitment and hiring processes without even realizing it.

“This is often due to a variety of factors, including the “halo effect,”” she says. “The halo effect is a type of cognitive bias that occurs when our overall perception of someone influences our feelings toward them. For example, we may interpret someone to be friendly and professional and immediately equate that with intellect or the ability to do a job. The problem here is that our interpretations often have no bearing on someone’s ability or fit for a position.”

Hiring bias recommendations

To try to stamp out this bias, Gartner recommends: “Do a root cause analysis of the factors that are preventing your IT department or technology company from hiring more-diverse candidates, paying special attention to the recruiting process and recruiting firms you use.”

More objective ways to screen and interview candidates are also needed, with blind recruiting and an end to using CVs to make hiring decisions two important options.

“A blind hiring technique such as anonymizing resumes allows recruiters and hiring managers to be more objective when evaluating a candidate’s skills, knowledge, and potential to succeed, free from unconscious biases of the candidate’s race, gender, age, and education level,” says Ji-A Min, head data scientist at hiring app Ideal.
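To make the idea concrete, here is a minimal sketch of how resume anonymization might work in practice. The candidate record and field names below are purely illustrative assumptions, not taken from any specific applicant tracking system:

```python
import re

# Hypothetical candidate record; field names are illustrative only.
candidate = {
    "name": "Jamal Washington",
    "email": "jamal.w@example.com",
    "graduation_year": 1998,
    "school": "Howard University",
    "skills": ["Python", "SQL", "data modeling"],
    "summary": "Jamal has 12 years of experience in analytics.",
}

# Fields that can proxy for race, gender, age, or education level.
BLIND_FIELDS = {"name", "email", "graduation_year", "school"}

def anonymize(record):
    """Return a copy of the record with identifying fields masked and
    the candidate's first name scrubbed from free-text fields."""
    redacted = {k: ("[REDACTED]" if k in BLIND_FIELDS else v)
                for k, v in record.items()}
    first_name = record["name"].split()[0]
    redacted["summary"] = re.sub(re.escape(first_name), "the candidate",
                                 redacted["summary"])
    return redacted

blind = anonymize(candidate)
print(blind["name"])     # [REDACTED]
print(blind["summary"])  # the candidate has 12 years of experience...
```

The reviewer sees only skills and experience; the fields most likely to trigger unconscious associations never reach them.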

Researchers from Harvard and Princeton found that blind recruiting increased the likelihood that a woman would be hired by between 25 and 46 percent.

Sabel believes that to get rid of bias entirely, you need to get rid of the resume – or at least give it much less weight.

“GPA tends to be a rather poor predictor of job performance, and employment gaps are often misconstrued as laziness,” he says. “For example, Frontier found that long-term unemployed new hires stuck around just as long as their recently employed counterparts, and were promoted more often.”

Are algorithms the answer?

Instead of using CVs or traditional recruiting methods to assess candidates and make hires, companies are now turning to algorithms.

Analysis from HBR shows that a simple equation outperforms human decisions by at least 25 percent when evaluating information provided by candidates during recruitment, whether the job is entry level, middle management, or in the C-suite.

Gartner says that algorithms can: “transform recruitment processes by replacing reliance on both recruiters’ intuition and automated résumé evaluation based on word matching with insights gleaned over time from analysis of large datasets. This will help hiring managers hire the best staff.”

Sabel says:

“An algorithm doesn’t give a damn what your name is. It doesn’t care about your gender or skin tone. As a result, algorithms remove implicit (unconscious) bias.”

King believes that algorithms are more effective because the human element of perceptions is removed.

“Some may assert that “blind recruiting” isn’t a smart move, but I think it can only be beneficial if implemented,” she says. “I believe that in hiring, algorithms should be utilized to help ensure that the playing field for candidates is level.”

However, Riviera’s Hunsinger has some cautionary words for companies considering using algorithms and machine learning in the recruitment process.

“This approach has the potential to enhance or amplify the “bias” in the hiring process,” says Hunsinger. “For example, if the learning algorithm senses a firm avoids selecting female applicants, it might surface only male candidates to facilitate the fastest conversion. The human touch needs to be involved to analyze the impact of the learning policy and its outcomes, to offset biases that run against the principles of recruiting and employment in general.”
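One concrete form that human oversight can take is a regular audit of the algorithm's selection outcomes. A common benchmark in US employment practice is the EEOC's "four-fifths rule": if one group's selection rate falls below 80 percent of the highest group's rate, that flags possible adverse impact. The sketch below uses made-up numbers for illustration:

```python
# Audit sketch: compare selection rates across groups using the
# four-fifths rule. The applicant data here is fabricated.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected."""
    return selected / applicants

def four_fifths_check(rates):
    """Ratio of each group's selection rate to the highest rate;
    a ratio below 0.8 is a red flag for adverse impact."""
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Hypothetical outcomes from an algorithmic screen.
rates = {
    "group_a": selection_rate(30, 100),  # 0.30
    "group_b": selection_rate(12, 100),  # 0.12
}

ratios = four_fifths_check(rates)
for group, ratio in ratios.items():
    flag = "FLAG" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here group_b's impact ratio is 0.40, well under the 0.8 threshold, so a human reviewer would investigate whether the algorithm is reproducing a historical bias rather than assuming its output is neutral.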

Algorithms in recruitment software

There is already a variety of recruitment software available that uses algorithms. Here are some of our picks that can help redress recruitment bias:

Blendoor

Blendoor, which has been described as “Tinder for hiring”, hides applicants’ information from companies that may lead to bias, such as photos, name, gender, age, and race. It matches based on skills, and suggests learning and development courses if candidates are lacking some core skills that may make them miss out on jobs. The app analyzes candidate and recruiting behavior, helping companies find any pain points when it comes to diverse recruiting.
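As a rough illustration of what skills-based matching means (this is a toy sketch, not Blendoor's actual algorithm), candidates can be scored against a job purely by the overlap between their skill set and the job's requirements, with no identifying information involved:

```python
# Toy skills-based matching: score candidates against a job by the
# Jaccard similarity of skill sets. All names and skills are made up.

def skill_match(candidate_skills, required_skills):
    """Jaccard similarity between two skill sets, in [0, 1]."""
    c, r = set(candidate_skills), set(required_skills)
    return len(c & r) / len(c | r) if c | r else 0.0

job = ["python", "sql", "etl"]
candidates = {
    "cand_1": ["python", "sql", "excel"],
    "cand_2": ["java", "sql"],
}

# Rank candidates by match score, best first.
ranked = sorted(candidates.items(),
                key=lambda kv: skill_match(kv[1], job),
                reverse=True)
print(ranked[0][0])  # cand_1
```

Because the score depends only on skills, adding a name, photo, or graduation year to the record would not change the ranking.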

GapJumpers

GapJumpers takes its inspiration from the TV show The Voice by setting what it calls “blind auditions”. These are performance challenges that potential employees take to showcase their skills, instead of being assessed via a CV.

HireVue

HireVue dispenses with traditional resumes, and replaces them with digital assessments, on-demand video interviews, and digital interviews to help hire the best talent, faster.

“When video interviewing is involved, AI and data science remove bias entirely,” says Sabel. “By tying over 20,000 visual and audio data points to the performance data of known top performers, we can identify future top performers with astounding accuracy.”

Ideal

Ideal automates the procedure of screening resumes using an artificial intelligence-powered algorithm. It also uses AI to find candidates from previous applications to other roles, and to perform outreach to potential employees. It then shortlists relevant candidates.

Joonko

Joonko provides insights and recommendations whenever an event of potential unconscious bias may happen to allow the recruiters to take real-time corrective actions by following the solution’s suggestions. Suggestions and recommendations vary according to previous successes, who the recruiters are, the possible issues, and so on.

Riviera

Riviera, a recruiting services firm, collects data, tags and structures relevant information, and then uses this to build candidate profiles.

“AI can infer candidate quality based on these data streams and benchmark them against our unique vertical – our assessments include skill parsing from resumes, workplace experience and tech-stack histories, domain and industry experience, and perceived career trajectory,” says Hunsinger.

Untapt

Aimed at hiring managers in the financial services sector, Untapt uses machine learning and an algorithm to match technical candidates with jobs and to provide a list of these candidates to companies.

Triplebyte

Triplebyte provides online coding tests to assess technical skills, and then matches technical staff with the right companies. The software screens candidates, then companies pitch themselves to those candidates before Triplebyte sends the best fits onsite for interviews.

“We helped Flexport, a company that is redefining the shipping industry, hire an engineer in Australia who had just graduated from college there, as well as a graduate of a coding bootcamp,” says Taggar. “These hires would never have happened if we hadn’t vouched for their skills after evaluating them using our credentials-blind process.”

What are your tips for reducing recruitment bias?

While studies suggest that blind recruitment and using machine learning and algorithms can help with hiring practices, the practical use is still in its infancy. The effects of this will take years to show up in terms of diversity in the workplace, and more work needs to be done.

We’d love to hear your tips on how companies can improve their hiring processes and reduce bias in the process. Let us know in the comments below or email me on karen@getapp.com.

Karen McCandless: Karen is a writer for the GetApp lab, as well as editor for the blog. Before working at GetApp, she spent a lot of time reviewing photo and productivity apps for Android and iOS, as well as covering all things B2B, primarily for retail and manufacturing. When not writing about B2B apps, she enjoys trips to the theater, playing badminton, and working out ways to travel more.