Facial Analysis Technology: Do Recruiters Really Need It?


With major companies halting the sale of facial recognition technology to law enforcement, this may be a good time to examine the applications of facial analysis in recruitment and whether recruiters really need it.

Artificial intelligence (AI)-powered technology is only going to become more pervasive in recruitment and other HR functions. And why not? It promises to improve efficiency and accuracy and allow human recruiters to spend more time where it matters – in building relationships with candidates and ensuring they have a positive candidate experience.

Taking a data-driven approach to recruitment is now the norm. But how much data is necessary, and what are the guidelines for its interpretation? When facial recognition technology is built on a seemingly large data set – albeit one that is still limited – does it simplify a recruiter’s job, or does it perpetuate more bias?

The aim of using facial recognition in recruitment – or, as some experts call it, facial analysis or facial expression recognition – is to capture a candidate’s emotions, because of the gap between what candidates say and what their faces express. But reading emotions is hard. And getting a machine to do it, while practical in theory, is not without its challenges.

Facial recognition is part of a growing subset of artificial intelligence called emotion AI. When applied in recruitment, the technology goes a step further and becomes facial analysis or facial expression recognition technology. It deciphers facial expressions, even micro-expressions such as eye movements and the curve and corners of the mouth, along with body gestures. It also assesses tone of voice, for example, the enthusiasm in a candidate’s voice, and helps recruiters evaluate whether a candidate may be a good fit for a job or organization. This raises a serious question.
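To make the mechanics concrete, the snippet below is a minimal, illustrative sketch of per-image emotion scoring using the open-source DeepFace library. The image file name is a placeholder, and this is not a description of any vendor’s actual product; commercial tools layer proprietary models, calibration, and scoring on top of this basic idea.

```python
# Minimal sketch of per-image emotion scoring with the open-source DeepFace
# library. "candidate_frame.jpg" is a placeholder; no vendor's pipeline is implied.
from deepface import DeepFace

results = DeepFace.analyze(img_path="candidate_frame.jpg", actions=["emotion"])
# Recent DeepFace versions return a list with one entry per detected face;
# older versions return a single dict, so normalize to a list.
results = results if isinstance(results, list) else [results]

for face in results:
    # Each entry holds per-class probabilities and the single most likely label.
    print(face["dominant_emotion"], face["emotion"])
```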

Learn More: IBM Aims to Address Bias by Ending Its Facial Recognition Businesses

Are We Using Facial Analysis Technology in Recruitment to Eliminate Bias, or Can It Perpetuate Bias Further?

Facial analysis is applied in different ways in recruitment. Some companies use providers that let candidates record their responses to a predefined set of questions; facial analysis is then applied to the recordings to assess a candidate’s suitability for the job. In other cases, the technology may be used in a live interview, where it assesses candidates as they respond to the interview questions.
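As a rough illustration of the recorded-response variant, the sketch below samples frames from a video file with OpenCV, scores each sampled frame with DeepFace, and tallies the dominant emotions. The file name, the sampling interval, and the idea of a simple tally are assumptions made for illustration; real vendor pipelines are proprietary and considerably more complex.

```python
# Illustrative sketch only: sample frames from a recorded answer, score each
# frame's dominant emotion, and tally the results. File name and sampling
# interval are assumptions; this is not any vendor's actual pipeline.
from collections import Counter

import cv2
from deepface import DeepFace


def tally_emotions(video_path: str, sample_every_n: int = 30) -> Counter:
    """Count the dominant emotion on every n-th frame of a recorded answer."""
    counts: Counter = Counter()
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every_n == 0:
            try:
                res = DeepFace.analyze(frame, actions=["emotion"])
                res = res if isinstance(res, list) else [res]
                counts[res[0]["dominant_emotion"]] += 1
            except ValueError:
                pass  # DeepFace raises ValueError when no face is detected
        frame_idx += 1
    cap.release()
    return counts


print(tally_emotions("recorded_answer.mp4"))  # hypothetical file
```

Even this toy version exposes the judgment calls hidden in such systems: which frames get sampled, how ties are broken, and what threshold turns a tally into a “fit” decision.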

This brings up serious concerns. Does the facial analysis software account for the context of a facial expression – for instance, the cultural significance of certain expressions, fear, awkwardness, or the inability to make eye contact in the case of, say, a neurodiverse candidate?

One study found that two separate facial recognition algorithms consistently scored black professional basketball players as having more negative emotions than white players, with one interpreting ambiguous facial expressions of black players as contemptuous. When all players smiled, however, regardless of race, the disparity in negative emotion scores narrowed.

Consider the ramifications of such analysis in the workplace, where an algorithm consistently identifies individuals from certain groups as exhibiting negative emotions and, therefore, as ineligible for a job. AI algorithms pick up on micro-expressions, and if a candidate is in distress during the interview, those expressions may be scored very differently from a smile of politeness or friendliness.
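One way teams could surface the kind of disparity the study describes is a simple group-level audit of the scores their tool produces. The sketch below uses pandas with invented column names and numbers purely to illustrate the check; it is not drawn from the study’s data or from any real product.

```python
# Hypothetical disparity check on emotion scores from some facial analysis tool.
# The columns ("group", "anger", "contempt") and the numbers are invented
# purely to illustrate the audit; they are not real data.
import pandas as pd

scores = pd.DataFrame(
    {
        "group":    ["A", "A", "A", "B", "B", "B"],
        "anger":    [0.05, 0.08, 0.06, 0.21, 0.17, 0.19],
        "contempt": [0.02, 0.04, 0.03, 0.15, 0.12, 0.14],
    }
)

# Average "negative emotion" score per group. A persistent gap on comparable
# footage (e.g. everyone smiling) is a signal to scrutinize the model, not the people.
scores["negative"] = scores[["anger", "contempt"]].mean(axis=1)
print(scores.groupby("group")["negative"].mean())
```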

The growing use of AI in video assessments has spawned entire YouTube series on how applicants can prepare for such interviews. In South Korea, applicants are learning how to handle interviews conducted by AI bots that use facial analysis for assessment. Experts conduct talks that teach candidates to master AI interviews that score them for “fear” and “joy.”

Some video AI assessments remove the human checker from the process entirely and may not even allow videos that fail their checks to be reviewed by humans. The real question to ask here is: what is the candidate being assessed against? What is the standard they are expected to meet? Has that standard been established anywhere? Does your own organization have one? And what does that ideal person look like?

If all employees were to meet this standard, would it culminate in a company full of the same type of employee in the guise of a “cultural fit”? What does that do to eliminate bias in recruitment? Some experts believe the technology is too nascent to have a real impact and may end up widening the divide rather than closing it.

A3 Facial recognition in recruitment to screen candidates is not a tested solution and companies should not be using it. #AXSChat – Most startups don’t have the resources or experts to create such solutions. Using them is a risk.

— Antonio Vieira Santos (@AkwyZ) May 26, 2020

The application of facial analysis, particularly in recruitment, can perpetuate bias, but the promise is that it can alleviate it, too. Human biases play out at a very subconscious level, and sometimes recruiters may simply not like the “look” or the “vibe” of a person if it doesn’t conform to their idea of what the ideal candidate should be. In such cases, a supposedly unbiased piece of software can eliminate that initial layer of bias and present a candidate for who they are.

Facial analysis software may prove useful for companies hiring for customer-facing roles. For instance, HireVue, a popular name in AI-powered video assessments for recruitment, counts Hilton and Unilever among its biggest clients, and they claim to have cut their hiring time from weeks to days. In 2019, HireVue also started training neurodiverse candidates to ace video interviews. However, the company has received negative publicity for its facial analysis technology for some of the reasons mentioned above.

Learn More: Do You Measure Culture Fit During Executive Recruitment?

So, How Much Does Facial Analysis Help a Recruiter?

Not enough data exists to assess whether facial analysis perpetuates bias or eliminates it. It demonstrates the potential for both.

The challenge with facial recognition and analysis is not just that the data is limited, but that AI is a field that already has a diversity crisis. Not enough women and people from other marginalized groups work in the field, and the concern is that this lack of diversity translates into skewed outcomes when AI algorithms are run. Facial recognition, or any form of AI, is only as weak or as strong as its data set, and as weak or as strong as the people preparing, creating, cleaning, and curating those data sets.

Facial analysis may not be most effective when applied in isolation. It will have to be combined with multiple methods and points of assessment, and human interviewers need to guide the technology as much as they are helped by it in order to make the right decisions. Spurred by the Black Lives Matter movement, diversity concerns are causing major upheaval in companies the world over, with prominent HR leaders stepping down from their positions.

And until facial analysis reaches a more advanced level – or until we have enough data about its effectiveness – adopting a combination of blind hiring and skills-based hiring may prove to be the most effective way to find the right candidate for the job.

What is your understanding and opinion of the use of facial analysis in recruitment? Share your thoughts with us on LinkedIn, Twitter, or Facebook.