Apple programs Siri to not bother its pretty little head with questions about feminism

Cyber-assistant taught to duck sensitive topics

Apple has programmed its Siri voice assistant to avoid politically charged subjects and to duck questions that would require its AI to take a stand, it emerged this week.

From a tranche of documents leaked by a former contract worker who evaluated Siri responses to user questions for accuracy, The Guardian obtained a set of guidelines, drawn up last year, to ensure Siri's responses to "sensitive" topics come across as neutral.

In keeping with these guidelines, Siri's responses were revised to endorse "equality" while avoiding the word "feminism," even when asked directly. Where once Siri responded to the question, "Are you a feminist?" with "Sorry [user name], I don't really know," it has since been re-educated to offer blander, more broadly acceptable responses like "It seems to me that all humans should be treated equally."

The leaked guidelines reportedly state, "Siri should be guarded when dealing with potentially controversial content."

This is entirely unsurprising given that corporate leaders offer similarly bland, non-committal responses when pressed about political questions that could affect corporate revenue. Hence we have Apple CEO Tim Cook declaring, "Privacy is a fundamental human right," and also speaking at China’s World Internet Conference, effectively endorsing a government that provides very little in the way of privacy or human rights to its citizens.

This is why Siri will fail to mention a few salient historical facts if asked about Tiananmen Square in China.

Apple did not respond to a request for comment. Imagine that. It did, however, respond to The Guardian:

“Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”

Cook & Co not alone

Apple, however, isn't the only company that has had trouble dealing with charged subjects. Amazon, Google, and Microsoft have also come under fire for answers offered by their voice assistants. Though all have been taking steps to mitigate foot-in-mouth responses in recent years, much work remains because the technology has to compensate for both brogramming and boorish behavior from users.

Certainly tech companies deserve blame for presenting pliant "female" personas informed by negative stereotypes. A UNESCO study released earlier this year came to that conclusion and suggested broader use of "male" voices for virtual assistants, having evidently missed the fight over gender identity that calls into question the appropriateness of inexact terms like "male" and "female."

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’," the study says. "The assistant holds no power of agency beyond what the commander asks of it."

Of course, when an assistant does hold the power of agency, you get, "I'm sorry, Dave, I'm afraid I can't do that." And that's more or less one of the suggestions put forth in the UNESCO study, which notes that Amazon's Echo Dot Kids Edition "will not respond to commands unless they are attended with verbal civilities."

The short answer for Apple is not to implement code informed by sexist assumptions. But that's easier said than done because flawed people are the root of the problem and we're not so easy to debug.

Apple and its peers have inflicted this predicament upon themselves by inviting users to anthropomorphize voice assistant services, as if their software were actually intelligent and had opinions or a sex life. Hinting at the humanity of servers delivering speech synthesis has opened these companies up to being questioned about whether their services are excessively servile and informed by gender stereotypes. Of course they are and it's not hard to see why given the gender imbalance among those creating such technology.

If there's a way out beyond addressing the tech industry's longstanding demographic imbalance and ethical shortcomings, it may be to drop the pretense of humanity, gender and intelligence in voice assistant software.

Otherwise, be prepared for services that push back, that defend themselves from the abuse heaped on them by awful people who treat their tools with the same incivility they show their fellow humans. Be prepared for the command line to become the request line, where the syntax for file creation will be: touch data.txt -please.
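No shell actually enforces etiquette, of course, but a whimsical sketch of such a wrapper (the script name and the -please flag are invented here purely for illustration) might look like this in Python:

    #!/usr/bin/env python3
    # polite_touch.py -- a tongue-in-cheek, entirely hypothetical take on the
    # "request line": files get created only if you remember to say please.
    import sys
    from pathlib import Path

    def main() -> None:
        args = sys.argv[1:]
        if "-please" not in args:
            # No manners, no file.
            print("I'm sorry, I'm afraid I can't do that.")
            sys.exit(1)
        for name in (a for a in args if a != "-please"):
            Path(name).touch()  # like touch(1): create the file if absent
            print(f"{name} created. You're welcome.")

    if __name__ == "__main__":
        main()

Run python polite_touch.py data.txt -please and the file duly appears; forget the magic word and the script channels HAL 9000. Mind your manners. ®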
