Hiring Algorithms Could Lead to Exclusion: MIT Study


The study found that traditionally designed AI models perpetuate human biases and exclude underrepresented minorities in the recruitment cycle.

Technology has been touted as a major disruptor in recruitment. In fact, recruitment is one of the HR functions that has adopted technology much faster and with greater ease than others. Companies have also used AI-based systems to drive their recruitment processes, aiming to remain objective and ensure diverse, inclusive hiring.

But the fact that hiring algorithms can be biased is well known, and a new MIT study proves this. As per the study, traditionally designed AI models are risky because they continue to perpetuate the human biases that have been factored into them, which means they tend to exclude underrepresented minorities in the recruitment cycle.

Three models were created as part of the research:

  • The first model used a typical static supervised learning approach (SL) and past data sets to make predictions.
  • The second model used the same SL approach but updated its training data throughout the test period with the hiring outcomes of applicants who had been selected for interviews (updating SL).
  • The third approach incorporated an upper confidence bound (UCB) by adding exploration bonuses that tend to be higher for underrepresented groups; a minimal sketch of this idea follows the list.
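
The study's actual model is not reproduced in this article, but the intuition behind a UCB-style exploration bonus can be sketched in a few lines of Python. Everything below, including the UCBScreener class, the group labels, and the exploration_weight constant, is a hypothetical illustration under simplified assumptions, not the MIT implementation: each group's estimated interview-success rate is combined with a bonus that shrinks as more applicants from that group are interviewed, so groups with little historical data are not locked out.

```python
import math
from collections import defaultdict

# Hypothetical sketch of a UCB-style screening score (not the study's model).
# Groups with few past interviews receive a larger exploration bonus, so sparse
# historical data does not permanently exclude them.

class UCBScreener:
    def __init__(self, exploration_weight=2.0):
        self.exploration_weight = exploration_weight
        self.selections = defaultdict(int)   # interviews granted per group
        self.successes = defaultdict(int)    # positive hiring outcomes per group
        self.total_selections = 0

    def score(self, group, base_score):
        """Combine a model's base score with a group-level exploration bonus."""
        n = self.selections[group]
        if n == 0:
            return float("inf")  # always explore groups never interviewed before
        estimated_quality = self.successes[group] / n
        bonus = self.exploration_weight * math.sqrt(
            math.log(self.total_selections + 1) / n
        )
        return base_score + estimated_quality + bonus

    def update(self, group, hired):
        """Record an interview outcome to refine future estimates."""
        self.selections[group] += 1
        self.total_selections += 1
        if hired:
            self.successes[group] += 1


# Example: two applicants with equal base scores from different groups.
screener = UCBScreener()
screener.update("group_a", hired=True)   # group_a already has history
screener.update("group_a", hired=False)
print(screener.score("group_a", base_score=0.5))
print(screener.score("group_b", base_score=0.5))  # unseen group is prioritized
```

The key design choice is that the bonus depends on how little is known about a group rather than on the group's historical outcomes, which is what allows the approach to counteract feedback loops in past hiring data.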

Screening Models Have Different Results

As per the study, the UCB model more than doubled the percentage of selected applicants who were Black or Hispanic, from 10% to 23%. Conversely, the static and updating SL algorithms decreased Black and Hispanic representation to approximately 2% and 5%, respectively.

This clearly shows that models with different degrees of AI-driven decision-making can produce drastically different results, particularly for hiring and promotions. AI has already skewed the recruiting decisions of leading employers away from hiring qualified women and minorities.


The Recruitment Bias Is Real

In 2003, the National Bureau of Economic Research (NBER) published the findings of a field experiment that assessed racial discrimination in the labor market. Fictional resumes with African-American-sounding names and white-sounding names were submitted in response to recruitment ads in newspapers, and resumes with white-sounding names received 50% more callbacks for interviews. The researchers concluded that this pointed to bias on the part of recruiters.

Now consider the same biases being fed into an AI-powered candidate screening platform, which then reproduces and can amplify them as it learns to accept or reject resumes based on historical data. The sketch below makes this concrete.
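
The following Python sketch is entirely hypothetical: the simulated data, feature name, rates, and helper functions are invented for illustration. It shows that a naive supervised model fit on historical callback decisions that carry a name-based bias ends up scoring the two groups exactly as unequally as the past recruiters did.

```python
import random

# Hypothetical illustration (not real data): a static screening model trained on
# historical, biased callback decisions simply reproduces those decisions.
random.seed(0)

def historical_callback(candidate):
    """Simulated past recruiter decisions with a name-based bias baked in."""
    base_rate = 0.5 if candidate["name_sounds_white"] else 0.33  # biased labels
    return random.random() < base_rate

# Build a "training set" of past applicants and the biased decisions made on them.
history = []
for _ in range(10_000):
    candidate = {"name_sounds_white": random.random() < 0.5}
    history.append((candidate, historical_callback(candidate)))

# A naive supervised model: learn the callback rate observed for each group.
def fit_callback_rates(history):
    counts, hits = {True: 0, False: 0}, {True: 0, False: 0}
    for candidate, called_back in history:
        key = candidate["name_sounds_white"]
        counts[key] += 1
        hits[key] += called_back
    return {key: hits[key] / counts[key] for key in counts}

model = fit_callback_rates(history)
print(model)  # the learned "scores" mirror the historical bias almost exactly
```

Once such learned scores drive which resumes are selected, and those selections become the next round of training data, the gap can widen over time rather than correct itself.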

Amazon’s is the most-cited example in discussions of bias in organizations. In 2018, it was reported that Amazon had pulled back its experimental machine-learning recruiting tool after it showed an inherent bias against women. It downgraded female candidates for software developer jobs and other technical posts because the system had been trained on resumes submitted to the company over a 10-year period, most of which came from men.

The video interviewing platform HireVue uses facial analysis and AI to measure parameters such as word choice, gestures, and voice inflections. This, however, could introduce unintentional bias: since the software likely draws on previous data to make its predictions, any bias that has existed against candidates with certain physical attributes may be amplified. This has many experts worried about the impact on hiring and about candidates being removed from the process for unvalidated reasons.


The good news is that there is a greater focus on the ethics of data collection for machine learning platforms. HR teams and recruiters would benefit from asking vendors about data quality before purchasing or subscribing to an AI-powered screening or recruitment platform. Some platforms may shorten the hiring process, but it will take special focus on HR’s part to evaluate the quality and diversity of the candidates the platform helps source.