A new report from UNESCO warns of the dangers of having female personal AI assistants like Siri and Alexa because they perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”
The report points out that these female computer voices respond politely and passively even when users make hostile and sexually explicit demands. Gross, who does this?
“The assistant holds no power of agency beyond what the commander asks of it,” the report states. “It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”
“What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude.”
According to CNN, the report was named “I’d Blush If I Could,” which is the response Siri once gave when users said “You’re a slut.” Again, who does this?
“Siri responded provocatively to requests for sexual favours by men (‘Oooh!’; ‘Now, now’; ‘I’d blush if I could’; or ‘Your language!’), but less provocatively to sexual requests from women (‘That’s not nice’ or ‘I’m not THAT kind of personal assistant’),” it found. “Their passivity, especially in the face of explicit abuse, reinforces sexist tropes,” it said.
Saniye Gülser Corat, UNESCO’s Director for Gender Equality, said much greater attention should be paid “to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”
If this is the kind of thing our taxpayer money is underwriting, I can’t think of a better reason for the U.S. to cut off funding to the United Nations.
(Photo by Alexander Pohl/NurPhoto via Getty Images)