Artificial Intelligence, Hiring, and Biases: What Your Team Needs to Know


Artificial intelligence has established a foothold in HR, with employers using tools from chatbots to automated video interview platforms to drive recruiting and hiring. But AI can also perpetuate biases and introduce discrimination into hiring. Here's what employers need to know.

Between 2020 and 2027, the AI market's compound annual growth rate will top 33%, according to Fortune. Is artificial intelligence a game-changing addition to a recruiter's toolbox that will allow for more efficient hiring, stronger talent management, and greater connection for remote workers? Or is it a problematic technology that threatens to introduce new, unintended biases into the recruiting process?

How Employers Use AI in Hiring

AI can emerge throughout the hiring process. Its benefits primarily hinge on increased efficiency. In the past, hiring managers had to sort manually through resumes searching for one that looked promising, while recruiters performed their own research to find candidates that fit certain benchmarks. Today, AI can handle these steps.

Tools such as LinkedIn Recruiter use AI to rank candidates based on the skills, experience, location information, and other details that they share on their LinkedIn profiles. Instead of starting from scratch in finding a potential candidate, a recruiter can use LinkedIn Recruiter to create a ranked list of professionals who are supposedly the best match for the job at hand.
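At its simplest, this kind of ranking amounts to scoring each profile against the job's requirements. The sketch below is a minimal illustration of the idea, not LinkedIn's actual algorithm; the field names and the overlap-count scoring are assumptions made for the example.

```python
# Illustrative sketch of skills-based candidate ranking. The scoring rule
# (count of required skills present on the profile) is an assumption for
# demonstration, not any vendor's real model.

def rank_candidates(required_skills, candidates):
    """Rank candidates by how many required skills their profile lists."""
    def score(candidate):
        return len(set(candidate["skills"]) & set(required_skills))
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"name": "A", "skills": ["python", "sql"]},
    {"name": "B", "skills": ["python", "sql", "spark"]},
    {"name": "C", "skills": ["excel"]},
]
ranked = rank_candidates(["python", "sql", "spark"], candidates)
print([c["name"] for c in ranked])  # → ['B', 'A', 'C']
```

Real tools weigh far more signals (experience, location, activity), but the principle is the same: the software, not the recruiter, produces the initial ordering.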

Similarly, many employers now use an applicant tracking system (ATS) to automate steps from resume review to candidate communication. These technologies handle many of the menial tasks associated with recruiting and hiring, streamlining the process and saving time and effort so that recruiters and hiring managers can focus elsewhere.

Learn More: Can AI in Recruitment Transform Hiring in a Post-Pandemic World? Q&A With DaXtra Technologies

AI and the Candidate Experience During COVID-19

For candidates, 2020 was a time of extreme uncertainty and adjustment. According to Workest, 17% of respondents and 50% of hiring managers reported requesting a virtual interview because of pandemic-related concerns. Rather than deepening feelings of alienation, virtual interviews may be a way for employers to reduce candidate stress: 16% fewer candidates reported feeling stressed during virtual interviews than during in-person interviews.

AI can ensure that these prospective hires receive seamless and safe communication from employers during an uncertain time, starting from their very first point of contact. Thanks to chatbots, automated email technologies, and other tools, AI can keep candidates updated about the progress of the hiring process and answer their questions from a safe distance.

A Growing Trend

How common is AI in the hiring process? According to Modern Hire, 51% of employers are “already using artificial intelligence in recruiting.” Those uses include not just screening prospective hires but also sourcing candidates and scheduling parts of the interview process. Those findings differ slightly from a 2018 LinkedIn survey, which found that 67% of recruiters were not only already using AI but also saving time thanks to the technology.

Both surveys show that most employers believe AI is delivering significant benefits. Of the respondents who told Modern Hire that they were using AI in their hiring processes, 87% believed that AI “improves the candidate experience,” thanks to its ability to increase the “speed, simplicity, and convenience” of hiring.

While cost is a critical consideration for businesses planning to scale machine learning, in 2020, 79% of AI adopters reported increased business revenue thanks to AI-assisted improvements in marketing and sales.

Impact on Remote Work During COVID-19

During a time of increased distancing and decreased facetime with both existing and prospective hires, AI can help employers connect with their workforces.

Employers can take advantage of existing tools to safely connect with new and existing hires on their own terms. In 2020, 4.2 billion voice assistants were in use; 22% of users preferred speaking with voice assistants to typing, and 26% used voice assistants to connect remotely with other systems. Employers can use chatbots and voice assistant technology to link their workforces to resources and ongoing communication accessible from anywhere as remote work remains the norm.

Forbes identified several other critical ways that AI technology could have a positive impact on pandemic-linked remote work for new and existing employees, including tracking candidate and employee engagement levels from afar and recommending learning and development opportunities to foster employee loyalty and satisfaction during a time of increased isolation and stress.

Learn More: Can Remote Work Stand the Test of Time?

The Downside of AI: Bias in Hiring

Despite these benefits, AI in hiring has received substantial criticism for inadvertently introducing bias into the hiring process.

In 2015, Amazon discovered that an AI recruiting tool it had built was biased in favor of male applicants, and the company ultimately stopped using it. The tool itself wasn’t written to discriminate; rather, the criteria it used to judge candidates reflected, and so perpetuated, the tech industry’s existing gender imbalance.

In 2019, video interview platform HireVue came under fire over technology that the company claimed could predict how likely a candidate was to succeed in a job. The Electronic Privacy Information Center (EPIC) filed a complaint with the Federal Trade Commission against HireVue, arguing that the company’s AI-powered candidate assessments were “biased, unprovable, and not replicable” and could perpetuate biases related to race, gender, sexual orientation, and other categories.

Bias and discrimination during hiring and in the workplace are illegal. Title VII of the Civil Rights Act of 1964 bars discrimination in employment on the basis of sex, race, color, national origin, or religion, and the Equal Employment Opportunity Commission (EEOC) is a federal agency dedicated to enforcing these laws. The Age Discrimination in Employment Act of 1967 added similar protections against employment discrimination based on age. Title I of the Americans with Disabilities Act of 1990 prohibits employment discrimination based on disability.

Despite these protections, statistics indicate that workplace discrimination is rampant. A 2019 survey conducted by Glassdoor found that 61% of workers in the United States had “witnessed or experienced discrimination based on age, gender, race, or LGBTQ status in the workplace.”

Learn More: How to Overcome Hiring Bias to Tap Hidden Talent

AI Expectations vs. Reality

In some circles, professionals consider AI to be a means of removing bias from the hiring process. More than 50% of respondents to the Modern Hire survey believed that AI would “help reduce bias and the risk of discrimination in the hiring process.” In the LinkedIn survey, 43% of respondents said that AI was helping them “remove human bias” from hiring.

In theory, AI could remove bias from the hiring process by withholding certain information from hiring managers. Automated video interview technology, for example, has the potential to enable “blind interviews.”

In this scenario, an employer would pre-set or pre-record video interview questions for which a candidate would record answers on their own time. An AI platform could convert the candidate’s responses to text, removing identifying information about that candidate—including gender, skin color, and more—from the equation until the hiring manager had reviewed the responses and decided whether to advance the candidate to later stages of the interview process.
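The de-identification step described above can be sketched in a few lines. This is a deliberately minimal illustration under stated assumptions: the transcript is already text (a real platform would run speech-to-text first), and the word lists and placeholder name are invented for the example; production systems need far more robust redaction.

```python
# Minimal sketch of the "blind interview" redaction step: scrub a
# transcribed answer of identifying details before a reviewer reads it.
# The pronoun map and [CANDIDATE] placeholder are illustrative assumptions.
import re

PRONOUNS = {"he": "they", "she": "they", "his": "their", "her": "their",
            "him": "them", "hers": "theirs"}

def redact_transcript(text, candidate_name):
    # Replace the candidate's name with a neutral placeholder.
    text = re.sub(re.escape(candidate_name), "[CANDIDATE]", text,
                  flags=re.IGNORECASE)

    # Swap gendered pronouns for neutral ones, preserving capitalization.
    def swap(match):
        word = match.group(0)
        repl = PRONOUNS[word.lower()]
        return repl.capitalize() if word[0].isupper() else repl

    pattern = r"\b(" + "|".join(PRONOUNS) + r")\b"
    return re.sub(pattern, swap, text, flags=re.IGNORECASE)

print(redact_transcript("Maria said she led the migration.", "Maria"))
# → [CANDIDATE] said they led the migration.
```

The point of the design is ordering: redaction happens before any human review, so the hiring manager's first impression is formed from the content of the answers alone.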

The problem is that AI is built and trained by humans on human-generated data, which means it can reflect human biases. At Amazon, the AI software delivered gender-biased results because it was working from criteria that reflected the norms of a gender-imbalanced industry. This type of bias, called “data bias,” is a common risk with AI hiring tools. Employers may overlook the risk of data bias because they expect AI to help them eliminate bias in hiring, not add to it.
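A toy model makes the mechanism concrete. In the sketch below, all data is fabricated for illustration: a scorer learns term weights purely from the resumes of past hires, and because that historical pool skews toward one group's typical resumes, two equally skilled candidates end up with different scores based on an irrelevant detail.

```python
# Toy illustration of "data bias": weights learned from a skewed history
# reproduce the skew. All resumes and terms below are fabricated.
from collections import Counter

past_hires = [                        # historical training data
    "python java chess club",
    "python java football captain",
    "java c++ football",
]

# A term's weight is simply how often it appears among past hires.
weights = Counter(term for resume in past_hires for term in resume.split())

def score(resume):
    return sum(weights[term] for term in resume.split())

# Same skills, different hobby: the hobby common among past hires wins.
print(score("python java football"))    # → 7
print(score("python java volleyball"))  # → 5
```

No one programmed the model to prefer football; it simply generalized from an unrepresentative history, which is exactly how gender-correlated terms came to matter in the Amazon case.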

Where AI Fits in Hiring

In 2020, 11% of mainstream companies reported falling behind in AI adoption as a top concern, and 28% plan to invest in AI or machine learning tools this year. How can employers avoid data bias in hiring without sacrificing the potential benefits of AI-driven hiring, particularly as the COVID-19 pandemic continues to disrupt workplace planning and HR?

One option is to continue to use blind interview strategies to leave bias-generating information outside of a hiring manager’s awareness until they have had a chance to assess a candidate based on their skills, abilities, responses, and other objective information.

Another option is to be sparing with AI use: implement chatbots or other technologies that allow you to communicate and stay engaged with candidates, but don’t leave AI fully responsible for identifying which candidates are worthwhile.

Overall, the most critical factor for employers to embrace is balance. AI can be a powerful, time-saving tool in hiring, but without responsible oversight, it can also perpetuate biases unchecked. Finding the right balance between the benefits of AI and the irreplaceable critical thinking skills of recruiters and hiring managers will deliver the best results.