Jennifer Bailey, vice-president of Apple Pay, introducing the Apple Card earlier this year. Photograph: Tony Avelar/AP

Apple’s ‘sexist’ credit card isn’t just a PR problem – it’s a nightmare for us all

Arwa Mahdawi

Men including Apple co-founder Steve Wozniak have been awarded significantly more credit than their wives. It’s a bleak insight into our algorithmic future

What is white, fragile and sexist? There are innumerable answers to that question, but the one I’m looking for in this instance is the Apple Card. The tech giant released its first branded credit card, managed by Goldman Sachs, earlier this year. The fancy titanium card is famously delicate; Apple has warned people not to store it in a leather wallet because it is easily damaged.

As it turns out, the card seems to be as allergic to women as it is to leather. Last week, David Heinemeier Hansson, a high-profile tech entrepreneur, tweeted that the card was “sexist” because it gave him 20 times more credit than his wife – seemingly for the sole reason of his gender. Hansson’s tweet went viral and Apple’s co-founder Steve Wozniak chimed in to say that he had also been given a much higher credit limit than his wife, even though the pair have no separate cards, accounts or assets. New York regulators are now investigating the claims of discrimination.

Goldman Sachs issued a vague statement on Sunday explaining that it looks at numerous factors when determining an individual’s creditworthiness, meaning two family members might receive different credit decisions. It stressed, however, that: “In all cases, we have not and will not make decisions based on factors like gender.”

This statement seems somewhat disingenuous. Like God, algorithms often work in mysterious ways, making opaque decisions that not even their creators can fully explain. Machine-learning algorithms don’t need to be told to take factors such as gender or race into account to make sexist or racist judgments: even when a protected attribute is excluded, other variables, such as a shopping history or a postcode, can act as proxies for it in the biased data they are fed. And this has worrying ramifications. Increasingly, every aspect of our lives, from our employability to our trustworthiness and our creditworthiness, is being influenced by these opaque algorithms. This isn’t just a PR problem for Apple and Goldman Sachs – it’s a nightmare for us all.
