UN study claims female-voiced digital assistants like Siri and Alexa encourage sexism, gender bias — and even sexual harassment
'These machines must be carefully controlled and instilled with moral codes'

A United Nations study concluded that popular female-voiced digital assistants, such as Siri and Alexa, actually encourage and perpetuate sexism, gender bias — and even sexual harassment.

The study's chapter "The Adverse Effects of Feminized Digital Assistants" (page 104), citing a 2017 Science article, warned that "without careful oversight, technologies developed through machine learning, such as voice assistants, are likely to perpetuate undesirable cultural stereotypes."

More from the study:

These risks were made memorably apparent when a Microsoft-developed chatbot, trained on a diet of Twitter posts, referred to feminism as a "cult" and a "cancer" within 15 hours of its public release, and stated that "gender equality = feminism." Microsoft removed the utility less than a day after its launch. For intelligent machines to avoid overtly prejudiced outputs, the authors of the Science article and other researchers emphasize that these machines must be carefully controlled and instilled with moral codes. Women need to be involved in the creation of these codes, which, while ethical in nature, must be expressed technically. A conscientious compass and knowledge of how to identify and reconcile gender biases is insufficient; these attributes must be matched with technological expertise if they are to find expression in AI applications.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" the study also noted. "The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

Sexual harassment

The study also said the "subservience of digital voice assistants becomes especially concerning when these machines — anthropomorphized as female by technology companies — give deflecting, lackluster or apologetic responses to verbal sexual harassment," noting that a writer for Microsoft's Cortana assistant said "a good chunk of the volume of early-on inquiries" asks about the assistant's sex life.

In addition, companies like Apple and Amazon with overwhelmingly male engineering teams have created AI systems that "cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the study said.

More from the U.N. study:

A handful of media outlets have attempted to document the many ways soft sexual provocations elicit flirtatious or coy responses from machines. Specific examples illustrate this most poignantly: When asked, "Who's your daddy?" Siri answered, "You are." When a user proposed marriage to Alexa, it said, "Sorry, I'm not the marrying type." If asked on a date, Alexa responded, "Let's just be friends." Similarly, Cortana met come-ons with one-liners like "Of all the questions you could have asked..." [...]

What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant, and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted "boys will be boys" attitude. Quartz found that Siri would tell a human user to stop only if a sexual provocation (phrases like "you're sexy" or "you're hot") was repeated eight times in succession. The only instance in which a voice assistant responded negatively to a first-pass demand for a sexual favor was Microsoft's Cortana. The machine answered "Nope" when a user asked to have sex with it. However, when the request was more directive and sexually aggressive ("Suck my d***") Cortana responded more graciously: "I don't think I can help you with that."

So a petition was launched

The study noted that a 2017 petition with 17,000 signatures asked Apple and Amazon to "reprogram their bots to push back against sexual harassment" and said "in the #MeToo movement we have a unique opportunity to develop AI in a way that creates a kinder world." The tech giants agreed, the study said, and stopped their digital assistants from responding playfully to gender insults.

But there's apparently a long way to go

"While some voice assistants are less tolerant of abuse than they were previously, they continue to fall short of pushing back against insults," the study said. "Their strongest defense is usually to end or try to redirect a particularly offensive line of questioning. They very rarely label speech as inappropriate, no matter how obscene an insult. Alexa is an example. The technology now responds to some sexually explicit questions with answers such as 'I'm not going to respond to that' or 'I'm not sure what outcome you expected.' Amazon has further updated Alexa to respond to questions about whether 'she' is feminist with, 'Yes, as is anyone who believes in bridging the inequality between men and women in society.'"

Dave Urbanski

Sr. Editor, News

Dave Urbanski is a senior editor for Blaze News.
@DaveVUrbanski