UN report: Female AI voices like ‘Alexa’ perpetuate harmful gender stereotypes
Female default voices for artificial intelligence personal assistants may inadvertently reinforce gender stereotypes, according to a study published by the United Nations Educational, Scientific and Cultural Organization (UNESCO).
Default female voices for the devices, along with names like Alexa and Siri, may precondition users toward antiquated views of women, according to the study. UNESCO also found the assistants rarely have safeguards against abuse and gendered language. For example, when told to make the user a sandwich, Siri responds, “I can’t. I don’t have any condiments,” according to the study. Insulting Siri prompts only the response “I’d blush if I could,” which is the title of the report.
“Because the speech of most voice assistants is female, it sends a signal that women are … docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility,” the report states.
Amazon reportedly chose a female-sounding voice for its assistant based on market research suggesting it would be perceived as more sympathetic, while Microsoft named its assistant Cortana after a character in its “Halo” video game series to build on existing name recognition.
The report offers several recommendations to remedy the potential issues, including ending the practice of making all digital assistants female by default, programming them to discourage gendered insults and providing more opportunities for women and girls in tech fields to give them a seat at the table during the development of such technologies.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gülser Corat, Director of Gender Equality at UNESCO, said in a statement. “Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”