How smart speaker AIs such as Alexa and Siri reinforce gender bias
Unesco urges tech firms to offer gender-neutral versions of their voice assistants
Smart speakers powered by artificial intelligence (AI) voice assistants that sound female are reinforcing gender bias, according to a new UN report.
Research by Unesco (United Nations Educational, Scientific and Cultural Organisation) found that AI assistants such as Amazon’s Alexa and Apple’s Siri perpetuate the idea that women should be “subservient and tolerant of poor treatment”, because the systems are “obliging and eager to please”, The Daily Telegraph reports.
The report, titled “I’d blush if I could” in reference to a phrase once uttered by Siri in response to a sexual comment, says tech companies that make their voice assistants female by default are suggesting that women are “docile helpers” who can be “available at the touch of a button”, the newspaper adds.
The agency also accuses tech companies of failing to “build in proper safeguards against hostile, abusive and gendered language”, reports The Verge.
Instead, most AIs respond to aggressive comments with a “sly joke”, the tech news site notes. If asked to make a sandwich, for example, Siri says: “I can’t. I don’t have any condiments.”
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” says the Unesco report.
What has other research found?
The Unesco report cites a host of studies, including research by US-based tech firm Robin Labs that suggests at least 5% of interactions with voice assistants are “unambiguously sexually explicit”.
And the company, which develops digital assistants, believes the figure is likely to be “much higher due to difficulties detecting sexually suggestive speech”, The Guardian reports.
The UN agency also points to a study by research firm Gartner, which predicts that by 2020 people will be having more conversations with the voice assistant in their smart speaker than with their spouses.
Voice assistants already manage an estimated one billion tasks per month, ranging from playing songs to contacting the emergency services.
Although some systems allow users to change the gender of their voice assistant, the majority activate “obviously female voices” by default, the BBC reports.
The Unesco report concludes that this apparent gender bias “warrants urgent attention”.
How could tech companies tackle the issue?
Unesco argues that firms should be required to make their voice assistants “announce” that they are not human when they interact with people, reports The Sunday Times.
The agency also suggests that users should be given the opportunity to select the gender of their voice assistant when they get a new device and that a gender-neutral option should be available, the newspaper adds.
In addition, tech firms should program voice assistants to condemn verbal abuse or sexual harassment with replies such as “no” or “that is not appropriate”, Unesco says.
Tech companies have yet to respond to the study.