2 Ways To Avoid Pitfalls of AI Usage in Tech Hiring Processes

Since the start of the pandemic, a huge number of businesses have moved their interview processes online. This shift has brought more technology into the digital interview process, including increased use of artificial intelligence (AI) and other automated tools to identify the best candidates.

One use case on the market right now is streamlining the resume review process with AI-powered tools. The problem with this approach is that layering AI on top of existing resume screens not only exacerbates pedigree bias but also creates a black box around the vetting process, making the bias harder to identify.

Filtering candidates based on pedigree indicators on their resumes, such as a previous employer or school, is not equitable. By using pedigree as a proxy for capability, resume screens remove candidates who do not come from traditional backgrounds. Furthermore, the backgrounds that AI tools screen for are almost always already over-represented in your talent pipeline through referrals and active recruiting initiatives.

AI bias in hiring is also becoming a political issue that talent leaders should be mindful of. Last year, New York City lawmakers targeted the problem with a new law that aims to curb hiring bias. Under the law, employers in NYC are barred from using automated employment decision tools to screen candidates unless the technology has undergone a bias audit within the year before it is used.

Other states, including Illinois and Maryland, have also moved to address the issue. The use of AI in hiring has been on lawmakers' radar for years, ever since workers began filing allegations of AI-related discrimination with the U.S. Equal Employment Opportunity Commission.

Reducing Pedigree Bias

How is bias baked into AI screening tools? The algorithms are trained to find patterns in historical hiring data and use those patterns to inform future hiring decisions. The big problem is that this reinforces the status quo, defining the “best” candidates as those most similar to past employees.
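To make that mechanism concrete, here is a minimal, hypothetical sketch using synthetic data and scikit-learn (not any vendor's actual screening tool): when past hiring decisions favored pedigree, a model trained on those outcomes learns to weight the pedigree signal heavily, even though capability is what you actually want to measure.

```python
# Illustrative sketch only: synthetic, hypothetical data showing how a model
# trained on past hiring outcomes can latch onto pedigree signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical features: true capability (what we care about) and a
# pedigree flag (e.g., "came from a target school or employer").
capability = rng.normal(size=n)
pedigree = rng.binomial(1, 0.3, size=n)

# Historical "hired" labels: past decisions leaned heavily on pedigree,
# with only a weak link to capability.
hired = (0.5 * capability + 2.0 * pedigree + rng.normal(size=n)) > 1.0

model = LogisticRegression().fit(np.column_stack([capability, pedigree]), hired)
print("learned weights [capability, pedigree]:", model.coef_[0])
# The pedigree weight dominates, so the automated screen reproduces past
# bias instead of measuring capability.
```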

So what’s the solution? First, reduce bias by giving more interviews to direct applicants. Second, on the interview side, adopt a human + technology approach that lets interviewers focus on what’s important: building rapport with candidates, providing clarity and setting them up to show their best selves in the interview. The result is an interview that is predictive, fair and ultimately enjoyable, unlocking opportunities for employees and teams.

Furthermore, Karat’s industry-wide interviewing data suggests that direct-applicant pools are being over-screened. Less than 10% of direct applicants for software engineering roles ever make it to the technical interview stage, yet at many companies, the direct applicants who do make it through pass their technical screens at rates comparable to referred and actively sourced candidates. What’s more, the data shows that direct candidates have higher close rates than other cohorts. By loosening resume screen requirements, talent leaders can not only reduce pedigree bias and create a more equitable process but also improve hiring yields and reduce sourcing costs.
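As a rough illustration of the yield argument, consider a back-of-the-envelope funnel calculation. The numbers below are made up for illustration and are not Karat's figures; the point is simply that if downstream pass and close rates hold, loosening the resume screen multiplies hires from the candidates you already have.

```python
# Hypothetical funnel arithmetic with invented rates, for illustration only.
def hires(pool, resume_pass_rate, tech_screen_pass_rate, close_rate):
    """Expected hires from a candidate pool given pass rates at each stage."""
    return pool * resume_pass_rate * tech_screen_pass_rate * close_rate

pool = 1000  # direct applicants
tight_screen = hires(pool, 0.10, 0.40, 0.50)   # <10% reach the tech interview
looser_screen = hires(pool, 0.25, 0.40, 0.50)  # same downstream pass/close rates

print(f"tight resume screen:  {tight_screen:.0f} hires")
print(f"looser resume screen: {looser_screen:.0f} hires")
# With downstream rates unchanged, relaxing the resume screen yields more
# hires from the same applicant pool, reducing reliance on paid sourcing.
```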

Using a Human + Tech Approach

Once you’ve ensured your screening process is mitigating bias, it’s essential to look at the interview itself, especially if you’re considering introducing any automated tools. We recommend that companies maintain a human element in any technical assessment or interview.

A human + technology approach helps ensure a fairer interview process. It places human safeguards on top of any AI implementation, catching false negatives that would otherwise reject great potential hires. It may take some trial and error to find the right balance. For instance, you could use technology and AI to lighten the mental load on your interviewers by suggesting questions based on the role and the competencies being evaluated, as in the sketch below. You could also use video recordings to review interviews and train AI to flag mistakes such as preferential treatment or a lack of clarity in how questions are presented.
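As a small, hypothetical illustration of the question-suggestion idea (the roles, competencies and question bank below are invented, not any specific product's feature), a simple competency-to-question mapping is enough to give interviewers a starting point:

```python
# Hypothetical sketch: suggest interview questions from a competency map.
QUESTION_BANK = {
    "data structures": ["Design a cache with O(1) lookup and eviction."],
    "debugging": ["Walk through how you'd isolate a flaky test."],
    "system design": ["Sketch a rate limiter for a public API."],
}

ROLE_COMPETENCIES = {
    "backend engineer": ["data structures", "system design"],
    "site reliability engineer": ["debugging", "system design"],
}

def suggest_questions(role: str) -> list[str]:
    """Return candidate questions for each competency mapped to the role."""
    return [q for c in ROLE_COMPETENCIES.get(role, []) for q in QUESTION_BANK[c]]

print(suggest_questions("backend engineer"))
```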

Establishing a human connection is essential in interviewing. Tests are cold and impersonal, while live interviews create a more enjoyable candidate experience and capture more nuance than binary pass/fail tests can. For instance, human interviewers can spot small missteps, like a typo on a coding test, that would otherwise keep a working technical solution from passing. Having a human administer interviews not only reduces false negatives but has also been shown to improve pass-through rates for women and candidates from underrepresented backgrounds compared to binary pass/fail code tests, according to our data.

There are several other advantages of having well-trained human interviewers. They can help put candidates at ease, clarify expectations, provide clear and transparent guidance and reduce false negatives that weed out otherwise qualified candidates.

How are you ensuring that you are avoiding AI pitfalls in the online interviewing and hiring process? Let us know on Facebook, Twitter, and LinkedIn.

