Apple Card Raises Issue of Algorithmic Sex Discrimination

Even artificial intelligence algorithms can be accused of sex discrimination. Just ask Apple Card.

The card system, introduced in March and run by Goldman Sachs, is under investigation by New York state financial regulators following allegations that it discriminates against women.

The same card that has aimed to disrupt the credit card industry – with its lack of identifying numbers or fees and its promises of new levels of security and privacy – is now facing accusations that its algorithms are designed to extend more credit to men than women.

“My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time,” says David Heinemeier Hansson, the Danish software programmer and entrepreneur who first called attention to the discrepancy on Twitter. “Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”

Even Apple’s co-founder complains

After Hansson’s tweets went viral, Apple’s co-founder Steve Wozniak tweeted that he had a similar experience when he applied with his wife.

A spokesman for the New York State Department of Financial Services, alerted to the accusations, said that “any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class of people violates New York law.”

Goldman Sachs denied that it discriminates on the basis of sex and said it extends credit based only on creditworthiness.

Questions of discrimination in algorithms are drawing increasing publicity as companies such as Apple, Goldman Sachs and Facebook grapple with suspicions that their supposedly bias-free, data-driven calculations quietly factor in sex and race.

Regulator examination

In the case of Apple Card, whether bias exists will likely be difficult to determine, given the proprietary nature of the credit-approval process and the fact that individual spending and earnings can factor into a credit limit.

Still, the New York financial regulator that polices Wall Street said it will scrutinize the criteria used by Apple Card’s algorithms and Goldman Sachs’ supervision of the card, according to Bloomberg.

In the United States, even unwittingly discriminating on the basis of sex or race is illegal. That means ensuring algorithms are constructed without bias is good business — and mandatory for staying on the right side of the law.
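One common way to test for this is a disparate-impact check. The sketch below computes average credit limits by sex on made-up records and applies the familiar “four-fifths” rule of thumb; the data, the grouping, and the 0.8 threshold are illustrative assumptions, not Apple’s or Goldman Sachs’ actual process.

```python
from statistics import mean

# Hypothetical applicant records: (sex, credit_limit_granted)
applications = [
    ("F", 5_000), ("F", 6_000), ("F", 4_500),
    ("M", 9_000), ("M", 11_000), ("M", 8_500),
]

def mean_limit(group):
    """Average credit limit granted to applicants in one group."""
    return mean(limit for sex, limit in applications if sex == group)

ratio = mean_limit("F") / mean_limit("M")
print(f"female/male mean-limit ratio: {ratio:.2f}")

# Flag the model for review if one group receives, on average, less
# than 80% of what the other receives (the "four-fifths" rule of thumb).
if ratio < 0.8:
    print("Potential disparate impact: audit the model's inputs.")
```

A real audit would control for legitimate factors such as income and repayment history before drawing conclusions, but the basic comparison-by-group step looks much like this.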

Facial Recognition Biases

It’s a lesson that designers of facial recognition technology are also struggling with, following reports that the software is much more accurate at recognizing white faces than black ones.

One proposed reason is that the facial databases underlying the technology contain far fewer black subjects than white ones, leaving African Americans vulnerable to potentially devastating misidentifications. The Institute of Electrical and Electronics Engineers is working to create standards for facial analysis software to resolve the issue.
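One way auditors surface this kind of imbalance is to compare group representation and per-group accuracy on an evaluation set. The sketch below uses invented placeholder results; it is not the IEEE standard under development or any vendor’s actual test suite.

```python
from collections import Counter

# Invented evaluation records: (group, predicted_identity, true_identity)
results = [
    ("white", "A", "A"), ("white", "B", "B"), ("white", "C", "C"),
    ("black", "D", "D"), ("black", "E", "F"), ("black", "G", "H"),
]

totals = Counter(group for group, _, _ in results)
correct = Counter(group for group, pred, true in results if pred == true)

for group, n in totals.items():
    print(f"{group}: n={n}, accuracy={correct[group] / n:.0%}")

# A large accuracy gap, or a lopsided n per group, points to an
# unbalanced underlying face database.
```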

In the credit industry, companies including Credit Kudos are working toward more transparency in credit scoring by allowing customers to provide their own data rather than relying on data harvested by the industry. These and similar improvements would make it easier to identify whether an algorithm is biased.

Hansson, who created the Ruby on Rails web application framework, tweeted his frustration about the credit limits to his 350,000 followers, and the tweet quickly went viral, according to Business Insider, receiving more than 12,000 likes and 5,000 retweets within hours.

‘VIP bump’ promised

The day after Hansson publicly complained about the Apple Card, he said Apple’s customer service representatives told him the divergent credit limits for his wife and himself were the result of an opaque algorithm.

His wife received a “VIP bump” that raised her credit limit to match his, according to The New York Times, but Hansson said the fix did not address the flawed algorithm behind Apple Card.

Hansson argued that consumers deserve more insight into the credit-limit process and that Goldman, among others, should disclose its methodology.

“It should be the law that credit assessments produce an accessible dossier detailing the inputs into the algorithm, provide a fair chance to correct faulty inputs, and explain plainly why differences apply,” Hansson wrote in a tweet. “We need transparency and fairness.”
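To illustrate what such a dossier could look like, here is a minimal sketch assuming a simple linear scoring model, where each input’s contribution to the final score can be printed directly. The feature names and weights are hypothetical, not any issuer’s real underwriting model.

```python
# Hypothetical feature weights for a toy linear credit-scoring model.
WEIGHTS = {"income": 0.4, "utilization": -0.3, "history_years": 0.2}

def decision_dossier(applicant):
    """Print each input, its value, and its contribution to the score."""
    total = 0.0
    for feature, weight in WEIGHTS.items():
        contribution = weight * applicant[feature]
        total += contribution
        print(f"{feature:>13}: value={applicant[feature]:>4} "
              f"weight={weight:+.1f} contribution={contribution:+.1f}")
    print(f"{'score':>13}: {total:+.1f}")

decision_dossier({"income": 90, "utilization": 30, "history_years": 12})
```

A linear model is deliberately easy to explain this way; the harder policy question Hansson raises is how to produce an equally plain accounting for the black-box models lenders actually use.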