
Making Siri, Google Assistant & other intelligent assistants female by default is sexist, says UN

Apple’s decision to make Siri’s voice female by default in the US and many other countries reinforces gender bias, claims a new United Nations report. The same complaint is levelled at Google Assistant, Amazon’s Alexa and Microsoft’s Cortana.

The problem with this, says the report, is that it ‘reflects and reinforces’ the idea that assistants – acting in a support role – are female …

The report is titled 'I'd blush if I could' – a phrase that used to be one of Siri's responses to being called a slut.

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

As voice-powered technology reaches into communities that do not currently subscribe to Western gender stereotypes, including indigenous communities, the feminization of digital assistants may help gender biases to take hold and spread. Because Alexa, Cortana, Google Home and Siri are all female exclusively or female by default in most markets, women assume the role of digital attendant, checking the weather, changing the music, placing orders upon command and diligently coming to attention in response to curt greetings like ‘Wake up, Alexa’.

The report is particularly concerned about the subconscious message sent to children, who are exposed to intelligent assistants (IAs) from a young age, when those assistants are female by default.

Professor Noble says that the commands barked at voice assistants – such as ‘find x’, ‘call x’, ‘change x’ or ‘order x’ – function as ‘powerful socialization tools’ and teach people, in particular children, about ‘the role of women, girls, and people who are gendered female to respond on demand’. Constantly representing digital assistants as female gradually ‘hard-codes’ a connection between a woman’s voice and subservience.

According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically.

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like. This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.

A secondary issue, the report argues, is the way IAs respond flirtatiously to offensive comments.

In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and discovered that the assistants, on average, either playfully evaded abuse or responded positively. The assistants almost never gave negative responses or labelled a user’s speech as inappropriate, regardless of its cruelty.

Siri, for example, used to respond to “You’re a slut” with replies which included “I’d blush if I could” and “Well, I never!” Apple has since changed the response to “I don’t know how to respond to that.”

Siri defaults to a female voice in most countries, but there are some curious exceptions.

Where the language is Arabic, French, Dutch or British English, Siri defaults to a male voice.

An Indiana University study back in 2017 found that both men and women prefer a female voice, describing it as welcoming, warm and nurturing – though preferences varied by content in rather stereotypical ways.



